Nov 26 14:50:31 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 26 14:50:31 crc restorecon[4582]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 14:50:31 crc restorecon[4582]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc 
restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc 
restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 
14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc 
restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 14:50:31 crc 
restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:31
crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc 
restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 14:50:31 crc restorecon[4582]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:31 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 
crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc 
restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 14:50:32 crc restorecon[4582]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc 
restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 14:50:32 crc restorecon[4582]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 14:50:32 crc restorecon[4582]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 26 14:50:33 crc kubenswrapper[4651]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 14:50:33 crc kubenswrapper[4651]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 26 14:50:33 crc kubenswrapper[4651]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 14:50:33 crc kubenswrapper[4651]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 26 14:50:33 crc kubenswrapper[4651]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 26 14:50:33 crc kubenswrapper[4651]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.111877 4651 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120268 4651 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120298 4651 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120305 4651 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120311 4651 feature_gate.go:330] unrecognized feature gate: Example Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120317 4651 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120323 4651 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120330 4651 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120336 4651 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120342 4651 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120347 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120353 4651 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120359 4651 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120365 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120371 4651 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120376 4651 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120382 4651 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120387 4651 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120393 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120398 4651 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120411 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120417 4651 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120422 4651 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120427 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120432 4651 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120438 4651 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120444 4651 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120449 4651 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120454 4651 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120459 4651 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120463 4651 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120468 4651 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120473 4651 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120478 4651 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120483 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120488 4651 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120493 4651 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120498 4651 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120503 4651 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120508 4651 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120513 4651 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120517 4651 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120522 4651 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120527 4651 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120532 4651 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120536 4651 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120541 4651 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120546 4651 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120550 4651 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120556 4651 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120569 4651 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120576 4651 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120581 4651 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120588 4651 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120593 4651 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120599 4651 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120605 4651 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120611 4651 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120616 4651 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120621 4651 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120626 4651 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120631 4651 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120635 4651 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120640 4651 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120646 4651 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120650 4651 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120655 4651 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120660 4651 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120665 4651 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120669 4651 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120675 4651 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.120682 4651 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120785 4651 flags.go:64] FLAG: --address="0.0.0.0"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120795 4651 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120804 4651 flags.go:64] FLAG: --anonymous-auth="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120812 4651 flags.go:64] FLAG: --application-metrics-count-limit="100"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120819 4651 flags.go:64] FLAG: --authentication-token-webhook="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120825 4651 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120833 4651 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120840 4651 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120846 4651 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120853 4651 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120859 4651 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120865 4651 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120872 4651 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120878 4651 flags.go:64] FLAG: --cgroup-root=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120883 4651 flags.go:64] FLAG: --cgroups-per-qos="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120889 4651 flags.go:64] FLAG: --client-ca-file=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120895 4651 flags.go:64] FLAG: --cloud-config=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120901 4651 flags.go:64] FLAG: --cloud-provider=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120906 4651 flags.go:64] FLAG: --cluster-dns="[]"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120914 4651 flags.go:64] FLAG: --cluster-domain=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120920 4651 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120926 4651 flags.go:64] FLAG: --config-dir=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120931 4651 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120937 4651 flags.go:64] FLAG: --container-log-max-files="5"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120945 4651 flags.go:64] FLAG: --container-log-max-size="10Mi"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120951 4651 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120957 4651 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120963 4651 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120969 4651 flags.go:64] FLAG: --contention-profiling="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120975 4651 flags.go:64] FLAG: --cpu-cfs-quota="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120981 4651 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120987 4651 flags.go:64] FLAG: --cpu-manager-policy="none"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.120993 4651 flags.go:64] FLAG: --cpu-manager-policy-options=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121000 4651 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121006 4651 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121012 4651 flags.go:64] FLAG: --enable-debugging-handlers="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121018 4651 flags.go:64] FLAG: --enable-load-reader="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121024 4651 flags.go:64] FLAG: --enable-server="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121029 4651 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121063 4651 flags.go:64] FLAG: --event-burst="100"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121072 4651 flags.go:64] FLAG: --event-qps="50"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121079 4651 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121085 4651 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121091 4651 flags.go:64] FLAG: --eviction-hard=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121098 4651 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121104 4651 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121109 4651 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121115 4651 flags.go:64] FLAG: --eviction-soft=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121121 4651 flags.go:64] FLAG: --eviction-soft-grace-period=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121126 4651 flags.go:64] FLAG: --exit-on-lock-contention="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121132 4651 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121137 4651 flags.go:64] FLAG: --experimental-mounter-path=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121143 4651 flags.go:64] FLAG: --fail-cgroupv1="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121148 4651 flags.go:64] FLAG: --fail-swap-on="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121154 4651 flags.go:64] FLAG: --feature-gates=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121161 4651 flags.go:64] FLAG: --file-check-frequency="20s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121166 4651 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121172 4651 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121178 4651 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121185 4651 flags.go:64] FLAG: --healthz-port="10248"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121192 4651 flags.go:64] FLAG: --help="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121199 4651 flags.go:64] FLAG: --hostname-override=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121206 4651 flags.go:64] FLAG: --housekeeping-interval="10s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121213 4651 flags.go:64] FLAG: --http-check-frequency="20s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121221 4651 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121228 4651 flags.go:64] FLAG: --image-credential-provider-config=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121235 4651 flags.go:64] FLAG: --image-gc-high-threshold="85"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121241 4651 flags.go:64] FLAG: --image-gc-low-threshold="80"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121248 4651 flags.go:64] FLAG: --image-service-endpoint=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121255 4651 flags.go:64] FLAG: --kernel-memcg-notification="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121262 4651 flags.go:64] FLAG: --kube-api-burst="100"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121269 4651 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121276 4651 flags.go:64] FLAG: --kube-api-qps="50"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121284 4651 flags.go:64] FLAG: --kube-reserved=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121291 4651 flags.go:64] FLAG: --kube-reserved-cgroup=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121298 4651 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121306 4651 flags.go:64] FLAG: --kubelet-cgroups=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121313 4651 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121320 4651 flags.go:64] FLAG: --lock-file=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121327 4651 flags.go:64] FLAG: --log-cadvisor-usage="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121334 4651 flags.go:64] FLAG: --log-flush-frequency="5s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121342 4651 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121353 4651 flags.go:64] FLAG: --log-json-split-stream="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121360 4651 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121367 4651 flags.go:64] FLAG: --log-text-split-stream="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121375 4651 flags.go:64] FLAG: --logging-format="text"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121382 4651 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121390 4651 flags.go:64] FLAG: --make-iptables-util-chains="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121396 4651 flags.go:64] FLAG: --manifest-url=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121401 4651 flags.go:64] FLAG: --manifest-url-header=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121409 4651 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121414 4651 flags.go:64] FLAG: --max-open-files="1000000"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121422 4651 flags.go:64] FLAG: --max-pods="110"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121428 4651 flags.go:64] FLAG: --maximum-dead-containers="-1"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121433 4651 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121439 4651 flags.go:64] FLAG: --memory-manager-policy="None"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121444 4651 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121450 4651 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121456 4651 flags.go:64] FLAG: --node-ip="192.168.126.11"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121462 4651 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121473 4651 flags.go:64] FLAG: --node-status-max-images="50"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121479 4651 flags.go:64] FLAG: --node-status-update-frequency="10s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121485 4651 flags.go:64] FLAG: --oom-score-adj="-999"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121491 4651 flags.go:64] FLAG: --pod-cidr=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121497 4651 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121505 4651 flags.go:64] FLAG: --pod-manifest-path=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121511 4651 flags.go:64] FLAG: --pod-max-pids="-1"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121517 4651 flags.go:64] FLAG: --pods-per-core="0"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121523 4651 flags.go:64] FLAG: --port="10250"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121528 4651 flags.go:64] FLAG: --protect-kernel-defaults="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121534 4651 flags.go:64] FLAG: --provider-id=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121539 4651 flags.go:64] FLAG: --qos-reserved=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121546 4651 flags.go:64] FLAG: --read-only-port="10255"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121552 4651 flags.go:64] FLAG: --register-node="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121557 4651 flags.go:64] FLAG: --register-schedulable="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121563 4651 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121572 4651 flags.go:64] FLAG: --registry-burst="10"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121577 4651 flags.go:64] FLAG: --registry-qps="5"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121583 4651 flags.go:64] FLAG: --reserved-cpus=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121588 4651 flags.go:64] FLAG: --reserved-memory=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121595 4651 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121601 4651 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121606 4651 flags.go:64] FLAG: --rotate-certificates="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121612 4651 flags.go:64] FLAG: --rotate-server-certificates="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121618 4651 flags.go:64] FLAG: --runonce="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121623 4651 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121629 4651 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121635 4651 flags.go:64] FLAG: --seccomp-default="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121641 4651 flags.go:64] FLAG: --serialize-image-pulls="true"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121647 4651 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121653 4651 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121660 4651 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121667 4651 flags.go:64] FLAG: --storage-driver-password="root"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121674 4651 flags.go:64] FLAG: --storage-driver-secure="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121680 4651 flags.go:64] FLAG: --storage-driver-table="stats"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121685 4651 flags.go:64] FLAG: --storage-driver-user="root"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121691 4651 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121697 4651 flags.go:64] FLAG: --sync-frequency="1m0s"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121703 4651 flags.go:64] FLAG: --system-cgroups=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121709 4651 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121717 4651 flags.go:64] FLAG: --system-reserved-cgroup=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121723 4651 flags.go:64] FLAG: --tls-cert-file=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121728 4651 flags.go:64] FLAG: --tls-cipher-suites="[]"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121735 4651 flags.go:64] FLAG: --tls-min-version=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121741 4651 flags.go:64] FLAG: --tls-private-key-file=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121746 4651 flags.go:64] FLAG: --topology-manager-policy="none"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121751 4651 flags.go:64] FLAG: --topology-manager-policy-options=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121757 4651 flags.go:64] FLAG: --topology-manager-scope="container"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121763 4651 flags.go:64] FLAG: --v="2"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121771 4651 flags.go:64] FLAG: --version="false"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121778 4651 flags.go:64] FLAG: --vmodule=""
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121785 4651 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.121791 4651 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121934 4651 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121941 4651 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121946 4651 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121951 4651 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121956 4651 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121962 4651 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121968 4651 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121973 4651 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121977 4651 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121982 4651 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121987 4651 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121993 4651 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.121998 4651 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122003 4651 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122008 4651 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122013 4651 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122018 4651 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122023 4651 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122027 4651 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122054 4651 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122059 4651 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122064 4651 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122069 4651 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122074 4651 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122079 4651 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122085 4651 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122091 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122098 4651 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122103 4651 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122109 4651 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122114 4651 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122120 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122127 4651 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122133 4651 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122143 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122148 4651 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122154 4651 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122159 4651 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122164 4651 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122169 4651 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122173 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122178 4651 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122189 4651 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122196 4651 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122202 4651 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122210 4651 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122217 4651 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122224 4651 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122232 4651 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122239 4651 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122245 4651 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122251 4651 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122258 4651 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122263 4651 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122268 4651 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122273 4651 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122278 4651 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122283 4651 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122288 4651 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122294 4651 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122300 4651 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122305 4651 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122310 4651 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122315 4651 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122320 4651 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122325 4651 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122332 4651 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122338 4651 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122342 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122347 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.122352 4651 feature_gate.go:330] unrecognized feature gate: Example
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.122368 4651 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.136714 4651 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.136754 4651 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.136918 4651 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.136932 4651 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.136943 4651 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.136952 4651 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.136961 4651 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.136971 4651 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.136981 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.136993 4651 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137004 4651 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137014 4651 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137023 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137144 4651 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137159 4651 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137169 4651 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137178 4651 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137187 4651 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137195 4651 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137204 4651 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137213 4651 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137221 4651 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137229 4651 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137241 4651 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137254 4651 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137264 4651 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137272 4651 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137281 4651 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137289 4651 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137298 4651 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137306 4651 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137315 4651 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137323 4651 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137332 4651 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137340 4651 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137349 4651 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137366 4651 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137374 4651 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137383 4651 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137391 4651 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137400 4651 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137408 4651 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137419 4651 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137430 4651 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137438 4651 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137448 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137456 4651 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137465 4651 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137473 4651 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137482 4651 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137490 4651 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137499 4651 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137507 4651 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137516 4651 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137524 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137532 4651 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137541 4651 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137550 4651 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137558 4651 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137566 4651 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137574 4651 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137583 4651 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137591 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137599 4651 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137608 4651 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137616 4651 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137625 4651 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137633 4651 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137642 4651 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137650 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137659 4651 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137667 4651 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137677 4651 feature_gate.go:330] unrecognized feature gate: Example
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.137691 4651 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137974 4651 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.137991 4651 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138005 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138014 4651 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138024 4651 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138057 4651 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138065 4651 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138074 4651 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138082 4651 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138092 4651 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138101 4651 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138109 4651 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138118 4651 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138126 4651 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138135 4651 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138143 4651 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138151 4651 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138160 4651 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138169 4651 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138177 4651 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138185 4651 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138194 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138203 4651 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138211 4651 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138220 4651 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138228 4651 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138237 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138245 4651 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138254 4651 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138262 4651 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138270 4651 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138278 4651 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138287 4651 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138295 4651 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138304 4651 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138315 4651 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138327 4651 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138337 4651 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138347 4651 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138356 4651 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138365 4651 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138374 4651 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138382 4651 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138391 4651 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138399 4651 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138408 4651 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138417 4651 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138425 4651 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138436 4651 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138446 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138455 4651 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138464 4651 feature_gate.go:330] unrecognized feature gate: Example
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138473 4651 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138483 4651 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138495 4651 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138507 4651 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138517 4651 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138527 4651 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138538 4651 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138548 4651 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138558 4651 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138568 4651 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138579 4651 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138590 4651 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138600 4651 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138610 4651 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138619 4651 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138629 4651 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138638 4651 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138649 4651 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.138667 4651 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.138684 4651 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.139924 4651 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.144793 4651 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.144894 4651 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.147097 4651 server.go:997] "Starting client certificate rotation"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.147128 4651 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.147348 4651 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-10 08:54:54.846566741 +0000 UTC
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.147499 4651 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1074h4m21.699072587s for next certificate rotation
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.175321 4651 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.181177 4651 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.200960 4651 log.go:25] "Validated CRI v1 runtime API"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.259716 4651 log.go:25] "Validated CRI v1 image API"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.261993 4651 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.271948 4651 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-26-14-44-30-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.272208 4651 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.292174 4651 manager.go:217] Machine: {Timestamp:2025-11-26 14:50:33.290149144 +0000 UTC m=+0.715896818 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199472640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f3371023-60f8-48eb-ae28-5202b22521c7 BootID:e2b80ad3-61c0-4ac3-b6ee-104ddf239417 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076107 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599734272 Type:vfs Inodes:3076107 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039894528 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:c9:d8:8d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c9:d8:8d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:94:5c:78 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b3:cf:2a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:92:3a:18 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:bc:55:4c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:72:d1:02:6f:6e:04 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:76:56:73:28:1e:36 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199472640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.292692 4651 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.292963 4651 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.293719 4651 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.294186 4651 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.294420 4651 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.294949 4651 topology_manager.go:138] "Creating topology manager with none policy"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.295134 4651 container_manager_linux.go:303] "Creating device plugin manager"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.295998 4651 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.296225 4651 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.296524 4651 state_mem.go:36] "Initialized new in-memory state store"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.296761 4651 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.314985 4651 kubelet.go:418] "Attempting to sync node with API server"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.315225 4651 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.315344 4651 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.315377 4651 kubelet.go:324] "Adding apiserver pod source"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.315409 4651 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.320821 4651 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.321972 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused
Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.322085 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError"
Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.322077 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused
Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.322178 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError"
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.322191 4651 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.326406 4651 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328601 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328653 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328671 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328688 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328713 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328729 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328744 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328771 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328790 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328807 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328839 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.328855 4651 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.330073 4651 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.330756 4651 server.go:1280] "Started kubelet" Nov 26 14:50:33 crc systemd[1]: Started Kubernetes Kubelet. Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.333416 4651 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.333410 4651 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.334116 4651 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.334723 4651 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.334765 4651 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.335077 4651 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 20:42:10.702084521 +0000 UTC Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.335702 4651 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 53h51m37.366385232s for next certificate rotation Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.335894 4651 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.335930 4651 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.335982 4651 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.336010 4651 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 26 14:50:33 crc 
kubenswrapper[4651]: W1126 14:50:33.336601 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.336666 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.338319 4651 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.339204 4651 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.339231 4651 factory.go:55] Registering systemd factory Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.339241 4651 factory.go:221] Registration of the systemd container factory successfully Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.339558 4651 factory.go:153] Registering CRI-O factory Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.339580 4651 factory.go:221] Registration of the crio container factory successfully Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.339600 4651 factory.go:103] Registering Raw factory Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.339620 4651 
manager.go:1196] Started watching for new ooms in manager Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.347723 4651 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="200ms" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.348833 4651 server.go:460] "Adding debug handlers to kubelet server" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.354817 4651 manager.go:319] Starting recovery of all containers Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.359870 4651 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.241:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b9606699044b3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 14:50:33.330713779 +0000 UTC m=+0.756461423,LastTimestamp:2025-11-26 14:50:33.330713779 +0000 UTC m=+0.756461423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.377358 4651 manager.go:324] Recovery completed Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378144 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378244 4651 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378276 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378331 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378361 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378509 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378561 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378591 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378624 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378653 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378686 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378713 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378739 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378826 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378855 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378936 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378968 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.378997 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379026 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379101 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379159 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379188 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379215 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379240 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379269 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379299 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379338 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379368 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379397 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379425 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379454 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379485 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379511 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379539 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379567 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379594 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379622 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379651 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379683 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379711 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379739 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379766 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379795 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379821 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379842 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379866 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379886 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379908 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379929 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379951 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379972 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.379991 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380020 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380090 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380124 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380154 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380183 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380227 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380253 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380281 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380311 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380340 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380368 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380395 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380423 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380450 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380477 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380503 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" 
seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380530 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380556 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380581 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380607 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380634 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380663 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: 
I1126 14:50:33.380691 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380719 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380745 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380773 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380798 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380824 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380849 4651 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380876 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380901 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380928 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380957 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.380984 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381011 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381076 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381111 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381137 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381209 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381237 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381266 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" 
seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381294 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381321 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381350 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381378 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381404 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381432 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381459 4651 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381488 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381517 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381545 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381574 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381611 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381644 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381674 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381707 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381735 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381766 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381796 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381827 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381856 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381885 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381910 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381936 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381964 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.381984 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382004 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382115 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382146 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382176 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382204 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382234 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382264 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382295 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382321 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382350 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382378 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382404 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382429 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382459 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382485 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382516 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382543 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.382571 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.386839 4651 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.386879 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.386899 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.386915 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.386938 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.386956 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.386972 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.386990 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387006 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387023 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387057 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387072 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387084 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387096 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387143 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387158 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387171 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387188 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387205 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387221 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387234 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387249 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387283 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387295 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" 
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387308 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387322 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387335 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387348 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387360 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387373 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387386 4651 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387401 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387413 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387428 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387447 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387479 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387493 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387506 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387519 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387532 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387545 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387558 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387571 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387584 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387597 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387616 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387630 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387644 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387657 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387679 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387693 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387705 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387718 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387730 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387743 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387756 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387773 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387790 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387806 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387823 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387841 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387856 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387869 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387882 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387895 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387909 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387920 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387933 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387947 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387960 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387972 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387985 4651 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.387998 4651 reconstruct.go:97] "Volume reconstruction finished" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.388006 4651 reconciler.go:26] "Reconciler: start to sync state" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.392423 4651 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.393801 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.393866 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.393891 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.394596 4651 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.394673 4651 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.394742 4651 state_mem.go:36] "Initialized new in-memory state store" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.397540 4651 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.400705 4651 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.400750 4651 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.400779 4651 kubelet.go:2335] "Starting kubelet main sync loop" Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.400827 4651 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.401545 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.401624 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.436146 4651 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.438907 4651 policy_none.go:49] "None policy: Start" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.439904 4651 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.439990 4651 state_mem.go:35] "Initializing new in-memory state store" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.490155 4651 manager.go:334] "Starting Device Plugin manager" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.490237 4651 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.490255 4651 server.go:79] "Starting device plugin registration server" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.490659 4651 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.490678 4651 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.490987 4651 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.491098 4651 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.491107 4651 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.497174 4651 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.501550 4651 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.501726 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.503476 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.503508 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.503517 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.503658 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.504146 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.504241 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.505160 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.505191 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.505210 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.505320 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.505523 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.505565 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.505844 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.505895 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.505924 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.506597 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.506617 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.506630 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.506910 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.506937 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.506949 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.507095 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc 
kubenswrapper[4651]: I1126 14:50:33.507478 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.507519 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.508306 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.508340 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.508349 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.508452 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.508840 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.508860 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.509231 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.509247 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.509255 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.509985 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.510008 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.510129 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.510168 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.510189 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.510201 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.510323 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.510367 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.511510 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.511551 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.511560 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.549025 4651 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="400ms" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591049 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591353 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591392 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591423 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591444 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591465 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591485 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591505 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591528 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591572 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591592 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591610 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591750 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 
14:50:33.591779 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591800 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.591821 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.592218 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.592242 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.592253 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.592274 4651 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.592508 4651 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" 
Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693186 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693273 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693309 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693340 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693374 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693404 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693435 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693463 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693485 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693498 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693600 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693641 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693614 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693655 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693661 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693686 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693779 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693798 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693790 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693814 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693827 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693886 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693906 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693911 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693967 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693987 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.694019 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.694078 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.693984 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.694154 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.793188 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.794901 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.794995 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.795011 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.795064 4651 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.795637 4651 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" Nov 
26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.837500 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.862033 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.869846 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.883553 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: I1126 14:50:33.892918 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.912754 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-040c748c1171d03a5b2800b2011892d14bb834762864e5d017b2d15e4c2b9249 WatchSource:0}: Error finding container 040c748c1171d03a5b2800b2011892d14bb834762864e5d017b2d15e4c2b9249: Status 404 returned error can't find the container with id 040c748c1171d03a5b2800b2011892d14bb834762864e5d017b2d15e4c2b9249 Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.928795 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e8b319f2a67603dfd5c3cd40a775a1bd077916952e23f7d577a448ddfd7b5243 WatchSource:0}: Error finding container e8b319f2a67603dfd5c3cd40a775a1bd077916952e23f7d577a448ddfd7b5243: Status 404 returned error can't find the container with id 
e8b319f2a67603dfd5c3cd40a775a1bd077916952e23f7d577a448ddfd7b5243 Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.933206 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-602df0f42b59520532e44bd389466a675eabdcad29857b147f2d0039a344a630 WatchSource:0}: Error finding container 602df0f42b59520532e44bd389466a675eabdcad29857b147f2d0039a344a630: Status 404 returned error can't find the container with id 602df0f42b59520532e44bd389466a675eabdcad29857b147f2d0039a344a630 Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.938768 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a4a98d411e89f15226e53f37d828a5a9c38916228bc1aeb2dfd4eb235c5cedfd WatchSource:0}: Error finding container a4a98d411e89f15226e53f37d828a5a9c38916228bc1aeb2dfd4eb235c5cedfd: Status 404 returned error can't find the container with id a4a98d411e89f15226e53f37d828a5a9c38916228bc1aeb2dfd4eb235c5cedfd Nov 26 14:50:33 crc kubenswrapper[4651]: W1126 14:50:33.942379 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3c702e061f27671d6264fccf8e43298c7434f112739f4b5b44598a50c8191bb8 WatchSource:0}: Error finding container 3c702e061f27671d6264fccf8e43298c7434f112739f4b5b44598a50c8191bb8: Status 404 returned error can't find the container with id 3c702e061f27671d6264fccf8e43298c7434f112739f4b5b44598a50c8191bb8 Nov 26 14:50:33 crc kubenswrapper[4651]: E1126 14:50:33.949833 4651 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" 
interval="800ms" Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.196155 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.197812 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.197887 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.197905 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.197941 4651 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 14:50:34 crc kubenswrapper[4651]: E1126 14:50:34.198505 4651 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" Nov 26 14:50:34 crc kubenswrapper[4651]: W1126 14:50:34.270596 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Nov 26 14:50:34 crc kubenswrapper[4651]: E1126 14:50:34.270734 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.339691 4651 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Nov 26 14:50:34 crc kubenswrapper[4651]: W1126 14:50:34.380155 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Nov 26 14:50:34 crc kubenswrapper[4651]: E1126 14:50:34.380232 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.405426 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3c702e061f27671d6264fccf8e43298c7434f112739f4b5b44598a50c8191bb8"} Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.406436 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a4a98d411e89f15226e53f37d828a5a9c38916228bc1aeb2dfd4eb235c5cedfd"} Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.407851 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"602df0f42b59520532e44bd389466a675eabdcad29857b147f2d0039a344a630"} Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.409222 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e8b319f2a67603dfd5c3cd40a775a1bd077916952e23f7d577a448ddfd7b5243"} Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.410336 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"040c748c1171d03a5b2800b2011892d14bb834762864e5d017b2d15e4c2b9249"} Nov 26 14:50:34 crc kubenswrapper[4651]: W1126 14:50:34.469934 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Nov 26 14:50:34 crc kubenswrapper[4651]: E1126 14:50:34.470105 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Nov 26 14:50:34 crc kubenswrapper[4651]: E1126 14:50:34.751010 4651 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="1.6s" Nov 26 14:50:34 crc kubenswrapper[4651]: W1126 14:50:34.780854 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Nov 26 14:50:34 crc kubenswrapper[4651]: E1126 
14:50:34.780952 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Nov 26 14:50:34 crc kubenswrapper[4651]: E1126 14:50:34.850527 4651 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.241:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b9606699044b3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 14:50:33.330713779 +0000 UTC m=+0.756461423,LastTimestamp:2025-11-26 14:50:33.330713779 +0000 UTC m=+0.756461423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 14:50:34 crc kubenswrapper[4651]: I1126 14:50:34.998897 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.000334 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.000364 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.000372 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.000391 4651 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Nov 26 14:50:35 crc kubenswrapper[4651]: E1126 14:50:35.000780 4651 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.339731 4651 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.416293 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5ac3adc10715786992515543ed414422c509b2deefee47097229ed25286f3db6"} Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.419053 4651 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d" exitCode=0 Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.419249 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.419385 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d"} Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.420606 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.420666 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.420687 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.421447 4651 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7f1e7e2666204191b57e52cdd9795f2faad44057d17bd438e7fe7c0fe83e7a1c" exitCode=0 Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.421512 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7f1e7e2666204191b57e52cdd9795f2faad44057d17bd438e7fe7c0fe83e7a1c"} Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.421599 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.422957 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.423013 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.423068 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.424911 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.426691 4651 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a33194bd316e013e14f2122af3e3b3e7ec5dda9cce167fcd0de830433d1b3719" exitCode=0 Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.426846 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a33194bd316e013e14f2122af3e3b3e7ec5dda9cce167fcd0de830433d1b3719"} Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.426884 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.427220 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.427282 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.427306 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.428384 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.428425 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.428440 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.431122 4651 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="fa66b7130f95645cf509e1171c42626af4c5a11e1f65d3b0bf78bc9296519a7d" exitCode=0 Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.431234 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"fa66b7130f95645cf509e1171c42626af4c5a11e1f65d3b0bf78bc9296519a7d"} Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 
14:50:35.431249 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.432711 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.432758 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:35 crc kubenswrapper[4651]: I1126 14:50:35.432770 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:36 crc kubenswrapper[4651]: W1126 14:50:36.202512 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Nov 26 14:50:36 crc kubenswrapper[4651]: E1126 14:50:36.202595 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.339729 4651 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Nov 26 14:50:36 crc kubenswrapper[4651]: E1126 14:50:36.352622 4651 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection 
refused" interval="3.2s" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.436062 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.436481 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5a84ecb14c115faac446761170a86cfd7232648b54ad9b5c1999376c2351422a"} Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.437186 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.437218 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.437226 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.439975 4651 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="67a5c3397860072ebd36c4dad787056886370e08ba2106bc4e1fae7c67d88eae" exitCode=0 Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.440050 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"67a5c3397860072ebd36c4dad787056886370e08ba2106bc4e1fae7c67d88eae"} Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.440107 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.440733 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.440756 4651 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.440764 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.442896 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"259885a3d60b69304822b915bb32f136504387ba02abfcb6eecfa47dec6035cc"} Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.442944 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5bb940f049c6b07ad9cee98e821ab9828fdfec9a42d7dc4cda7b37ea2505eb55"} Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.442954 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2a8f9c1ec5ff4382da63f44dc02ce34fecdd32e2f2b108d36d65b939fb998cf2"} Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.442966 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.443681 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.443703 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.443712 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.445503 
4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"923210e4f2e95b5f2fb2fc0aa190ba52f4e2aa4029de177c68553a79b064b28b"}
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.445531 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"33e1c55f35d59813dd0af255b2aba285aaf9feaf7f8db921afd04e0bb6bf8757"}
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.445544 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d5206d7f29dc3a389a62826de4a97864454278e6c3623dcde0fd418a4151e4f7"}
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.445622 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.446556 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.446582 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.446590 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.448476 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1"}
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.448517 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497"}
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.448529 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e"}
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.601446 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.602616 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.602651 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.602659 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:36 crc kubenswrapper[4651]: I1126 14:50:36.602693 4651 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 26 14:50:36 crc kubenswrapper[4651]: E1126 14:50:36.603178 4651 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc"
Nov 26 14:50:36 crc kubenswrapper[4651]: W1126 14:50:36.720222 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused
Nov 26 14:50:36 crc kubenswrapper[4651]: E1126 14:50:36.720303 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError"
Nov 26 14:50:37 crc kubenswrapper[4651]: W1126 14:50:37.337598 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused
Nov 26 14:50:37 crc kubenswrapper[4651]: E1126 14:50:37.337693 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.340194 4651 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.452531 4651 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="745a6b3ab5da742eda1979f6bed5ca0f25bc5dfc37c3c092e2d081d7afc4a830" exitCode=0
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.452570 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"745a6b3ab5da742eda1979f6bed5ca0f25bc5dfc37c3c092e2d081d7afc4a830"}
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.452632 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.453541 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.453570 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.453582 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.459536 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.459988 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.460119 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd"}
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.460152 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fb93b59a5a145c7430f6a0d2d20a52b82640be14fdbe0b3a09982193b8c6f23a"}
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.460161 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.460214 4651 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.460241 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.460699 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.460722 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.460732 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.460979 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.461004 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.461015 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.461308 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.461402 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.461507 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.461374 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.461617 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.461629 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:37 crc kubenswrapper[4651]: I1126 14:50:37.659955 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.142345 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.465129 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"236a2c48bb494fb7766d4c38f49705507f92a7da3490fec47b2ace66ea947952"}
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.465185 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e1da97dba31ae5387fd53482ca20b2bb3de0aec68c4c968cff243840a4af01c6"}
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.465206 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b0c3f28200744b12c4d5074b4bfac038105210ac550b22c0f6f22b0c41826f0d"}
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.465223 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd2f92aaf80ec31af521439a3a7b82d4a85e09327aee5e4346e06dc44102a3df"}
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.465226 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.465259 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.465340 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.466241 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.466275 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.466288 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.466246 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.466375 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:38 crc kubenswrapper[4651]: I1126 14:50:38.466386 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.470861 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f2f9af87bb4e0374387c8ba4806732809b8d86301c466f72680231694afdc2fc"}
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.470911 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.471385 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.471595 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.471626 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.471638 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.472082 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.472215 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.472296 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.757501 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.757639 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.759179 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.759210 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.759221 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.803610 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.804759 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.804816 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.804834 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:39 crc kubenswrapper[4651]: I1126 14:50:39.804869 4651 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 26 14:50:40 crc kubenswrapper[4651]: I1126 14:50:40.473331 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:40 crc kubenswrapper[4651]: I1126 14:50:40.474654 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:40 crc kubenswrapper[4651]: I1126 14:50:40.474684 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:40 crc kubenswrapper[4651]: I1126 14:50:40.474728 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:40 crc kubenswrapper[4651]: I1126 14:50:40.910187 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 14:50:40 crc kubenswrapper[4651]: I1126 14:50:40.910778 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:40 crc kubenswrapper[4651]: I1126 14:50:40.912480 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:40 crc kubenswrapper[4651]: I1126 14:50:40.912523 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:40 crc kubenswrapper[4651]: I1126 14:50:40.912540 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:42 crc kubenswrapper[4651]: I1126 14:50:42.236566 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Nov 26 14:50:42 crc kubenswrapper[4651]: I1126 14:50:42.236716 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:42 crc kubenswrapper[4651]: I1126 14:50:42.237681 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:42 crc kubenswrapper[4651]: I1126 14:50:42.237707 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:42 crc kubenswrapper[4651]: I1126 14:50:42.237715 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:43 crc kubenswrapper[4651]: E1126 14:50:43.497246 4651 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.138694 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.138886 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.139904 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.139964 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.139981 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.887907 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.889084 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.890290 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.890324 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.890337 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.897814 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 14:50:44 crc kubenswrapper[4651]: I1126 14:50:44.960303 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 14:50:45 crc kubenswrapper[4651]: I1126 14:50:45.486104 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:45 crc kubenswrapper[4651]: I1126 14:50:45.487445 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:45 crc kubenswrapper[4651]: I1126 14:50:45.487505 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:45 crc kubenswrapper[4651]: I1126 14:50:45.487528 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:45 crc kubenswrapper[4651]: I1126 14:50:45.491817 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 14:50:46 crc kubenswrapper[4651]: I1126 14:50:46.488297 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:46 crc kubenswrapper[4651]: I1126 14:50:46.489395 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:46 crc kubenswrapper[4651]: I1126 14:50:46.489430 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:46 crc kubenswrapper[4651]: I1126 14:50:46.489439 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:47 crc kubenswrapper[4651]: I1126 14:50:47.490994 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:47 crc kubenswrapper[4651]: I1126 14:50:47.492057 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:47 crc kubenswrapper[4651]: I1126 14:50:47.492102 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:47 crc kubenswrapper[4651]: I1126 14:50:47.492116 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:47 crc kubenswrapper[4651]: I1126 14:50:47.659982 4651 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 26 14:50:47 crc kubenswrapper[4651]: I1126 14:50:47.660140 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 26 14:50:47 crc kubenswrapper[4651]: W1126 14:50:47.676598 4651 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Nov 26 14:50:47 crc kubenswrapper[4651]: I1126 14:50:47.676756 4651 trace.go:236] Trace[818534317]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 14:50:37.675) (total time: 10001ms):
Nov 26 14:50:47 crc kubenswrapper[4651]: Trace[818534317]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:50:47.676)
Nov 26 14:50:47 crc kubenswrapper[4651]: Trace[818534317]: [10.001494885s] [10.001494885s] END
Nov 26 14:50:47 crc kubenswrapper[4651]: E1126 14:50:47.676790 4651 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Nov 26 14:50:47 crc kubenswrapper[4651]: I1126 14:50:47.950378 4651 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 26 14:50:47 crc kubenswrapper[4651]: I1126 14:50:47.950445 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 26 14:50:47 crc kubenswrapper[4651]: I1126 14:50:47.960494 4651 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 26 14:50:47 crc kubenswrapper[4651]: I1126 14:50:47.960601 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Nov 26 14:50:48 crc kubenswrapper[4651]: I1126 14:50:48.798316 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Nov 26 14:50:48 crc kubenswrapper[4651]: I1126 14:50:48.798560 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:48 crc kubenswrapper[4651]: I1126 14:50:48.800236 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:48 crc kubenswrapper[4651]: I1126 14:50:48.800347 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:48 crc kubenswrapper[4651]: I1126 14:50:48.800368 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:48 crc kubenswrapper[4651]: I1126 14:50:48.873951 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Nov 26 14:50:49 crc kubenswrapper[4651]: I1126 14:50:49.495687 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:49 crc kubenswrapper[4651]: I1126 14:50:49.497256 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:49 crc kubenswrapper[4651]: I1126 14:50:49.497310 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:49 crc kubenswrapper[4651]: I1126 14:50:49.497334 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:49 crc kubenswrapper[4651]: I1126 14:50:49.509067 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Nov 26 14:50:50 crc kubenswrapper[4651]: I1126 14:50:50.497691 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:50 crc kubenswrapper[4651]: I1126 14:50:50.498530 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:50 crc kubenswrapper[4651]: I1126 14:50:50.498560 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:50 crc kubenswrapper[4651]: I1126 14:50:50.498569 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.666339 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.666506 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.667546 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.667572 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.667579 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.671265 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 14:50:52 crc kubenswrapper[4651]: E1126 14:50:52.925718 4651 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.928098 4651 trace.go:236] Trace[1761339227]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 14:50:40.604) (total time: 12323ms):
Nov 26 14:50:52 crc kubenswrapper[4651]: Trace[1761339227]: ---"Objects listed" error: 12323ms (14:50:52.927)
Nov 26 14:50:52 crc kubenswrapper[4651]: Trace[1761339227]: [12.323890028s] [12.323890028s] END
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.928139 4651 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.928811 4651 trace.go:236] Trace[21840107]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 14:50:42.806) (total time: 10122ms):
Nov 26 14:50:52 crc kubenswrapper[4651]: Trace[21840107]: ---"Objects listed" error: 10121ms (14:50:52.928)
Nov 26 14:50:52 crc kubenswrapper[4651]: Trace[21840107]: [10.122011895s] [10.122011895s] END
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.928838 4651 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Nov 26 14:50:52 crc kubenswrapper[4651]: E1126 14:50:52.936634 4651 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.938426 4651 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.939456 4651 trace.go:236] Trace[223828282]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 14:50:40.040) (total time: 12899ms):
Nov 26 14:50:52 crc kubenswrapper[4651]: Trace[223828282]: ---"Objects listed" error: 12899ms (14:50:52.939)
Nov 26 14:50:52 crc kubenswrapper[4651]: Trace[223828282]: [12.899268478s] [12.899268478s] END
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.939488 4651 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.963696 4651 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:36314->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.963748 4651 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:36320->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.963799 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:36314->192.168.126.11:17697: read: connection reset by peer"
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.963818 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:36320->192.168.126.11:17697: read: connection reset by peer"
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.964402 4651 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Nov 26 14:50:52 crc kubenswrapper[4651]: I1126 14:50:52.964516 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Nov 26 14:50:53 crc kubenswrapper[4651]: E1126 14:50:53.497717 4651 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Nov 26 14:50:53 crc kubenswrapper[4651]: I1126 14:50:53.505247 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 26 14:50:53 crc kubenswrapper[4651]: I1126 14:50:53.507012 4651 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fb93b59a5a145c7430f6a0d2d20a52b82640be14fdbe0b3a09982193b8c6f23a" exitCode=255
Nov 26 14:50:53 crc kubenswrapper[4651]: I1126 14:50:53.507294 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fb93b59a5a145c7430f6a0d2d20a52b82640be14fdbe0b3a09982193b8c6f23a"}
Nov 26 14:50:53 crc kubenswrapper[4651]: I1126 14:50:53.507370 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 26 14:50:53 crc kubenswrapper[4651]: I1126 14:50:53.508646 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 26 14:50:53 crc kubenswrapper[4651]: I1126 14:50:53.508694 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 26 14:50:53 crc kubenswrapper[4651]: I1126 14:50:53.508704 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 26 14:50:53 crc kubenswrapper[4651]: I1126 14:50:53.509246 4651 scope.go:117] "RemoveContainer" containerID="fb93b59a5a145c7430f6a0d2d20a52b82640be14fdbe0b3a09982193b8c6f23a"
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.011443 4651 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.327210 4651 apiserver.go:52] "Watching apiserver"
Nov 26
14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.330812 4651 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.331240 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.331788 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.332400 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.332538 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.332924 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.332991 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.333153 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.333262 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.333362 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.333616 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.338512 4651 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.343597 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.343784 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.343798 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.345504 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.347617 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.347685 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.347752 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.347796 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.347829 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.347866 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.347911 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.347942 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.347974 4651 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348006 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348054 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348078 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348146 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348179 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348214 4651 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348244 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348249 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348286 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348320 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348355 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348390 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348423 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348459 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348494 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348529 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348567 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348606 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348643 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348680 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348736 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348774 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 26 14:50:54 crc kubenswrapper[4651]: 
I1126 14:50:54.348831 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348882 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348933 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.348973 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349020 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349096 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349182 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349214 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349253 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349286 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349319 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349350 4651 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349395 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349473 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349507 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349541 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349575 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349608 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349642 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349677 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349708 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349744 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349778 4651 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349858 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349895 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349929 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.349966 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350005 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350068 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350128 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350181 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350243 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350277 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350312 4651 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350346 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350389 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350423 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350459 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350493 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350540 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350574 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350607 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350640 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350672 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350721 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350760 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350794 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350840 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350873 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350910 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 26 14:50:54 crc 
kubenswrapper[4651]: I1126 14:50:54.350942 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.350975 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351009 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351070 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351105 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351148 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351196 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351245 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351298 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351352 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351404 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351460 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351516 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351573 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351627 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351679 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351733 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351793 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351850 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351905 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.351957 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352010 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352045 4651 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352097 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352160 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352213 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352268 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352332 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352440 4651 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352499 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352557 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352613 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352664 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352716 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352771 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352830 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352883 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352932 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.352984 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353084 4651 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353171 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353225 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353283 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353339 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353397 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353456 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353486 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353509 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353562 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353618 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353618 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353835 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353892 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353929 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.353967 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354003 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354086 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354129 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354168 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354212 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354265 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 
26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354322 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354376 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354423 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354476 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354526 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354576 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354627 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354678 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354729 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354788 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354857 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 14:50:54 crc kubenswrapper[4651]: 
I1126 14:50:54.354910 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.354962 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.355023 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357235 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357279 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357317 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357359 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357396 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357434 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357472 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357507 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 14:50:54 
crc kubenswrapper[4651]: I1126 14:50:54.357543 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357577 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357613 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357650 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357687 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357724 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357760 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357798 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357838 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357877 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357922 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.357964 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358001 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358084 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358130 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358167 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358201 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358236 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358271 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358309 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358346 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358386 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 14:50:54 
crc kubenswrapper[4651]: I1126 14:50:54.358424 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358459 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358495 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358532 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358567 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358614 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358649 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358685 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358719 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358753 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358787 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358822 4651 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358857 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358894 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.358971 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359033 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359117 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359158 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359201 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359238 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359275 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 14:50:54 crc kubenswrapper[4651]: 
I1126 14:50:54.359312 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359348 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359388 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359426 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359467 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359505 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359547 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.359613 4651 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.363406 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.355460 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: 
"6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.355912 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.356442 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.360297 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.377081 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.377597 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.382430 4651 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.382536 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:54.882513241 +0000 UTC m=+22.308260845 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.383519 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.383692 4651 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.384013 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.385912 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.387720 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.387898 4651 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.387979 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:54.887939183 +0000 UTC m=+22.313686887 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.360670 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.360764 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.360958 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.360981 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.361277 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.361527 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.361671 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.361745 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.362021 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.362306 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.362379 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.362313 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.362454 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.362522 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.362710 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.362773 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.363342 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.365538 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.365611 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.365846 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.365972 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.366115 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.366333 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.366361 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.366834 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.367094 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.367500 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.367901 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.368428 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.368471 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.368562 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.369414 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.369595 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.369659 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.369741 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.370130 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.370223 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.370325 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.370350 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.370853 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.371008 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.371244 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.373297 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.373336 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.373748 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.374008 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.374451 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.374532 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.374537 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.374577 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.374634 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.375366 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.375396 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.375408 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.375678 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.375711 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.375799 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.375927 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.375945 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.375976 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.376086 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.376091 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.376350 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.382399 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.391465 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.398393 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.399542 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.401251 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.401379 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.401422 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.401587 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.401662 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.402187 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.402223 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.402549 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.402563 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.402591 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.402610 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.402692 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.402902 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.402945 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.402980 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.403156 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.403394 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.403489 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.403795 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.403817 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.403837 4651 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.403905 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:54.903883738 +0000 UTC m=+22.329631552 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.404228 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.404591 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.404949 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.405141 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.405364 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.405439 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.406003 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.406093 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.406488 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.406984 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.407194 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.408147 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.408274 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.408305 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:50:54.908279672 +0000 UTC m=+22.334027366 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.409776 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.410017 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.414513 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.415068 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.415410 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.415460 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.415587 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.415867 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.415944 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.416284 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.416731 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.416999 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.417023 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.417050 4651 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.417117 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:54.917100622 +0000 UTC m=+22.342848436 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.417620 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.417842 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.418201 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.418765 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.419067 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.419413 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.420729 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.420833 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.421130 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.421164 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.421184 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.421261 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.419431 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.419464 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.419515 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.419540 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.419791 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.419976 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.421465 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.421588 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.421652 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.422191 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.422508 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.425531 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.425825 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.427176 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.430705 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.431057 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.431137 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.431153 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.431296 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.431580 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.431675 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.431762 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.431820 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.431770 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.432345 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.432615 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.432729 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.432896 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.433110 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.433163 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.433389 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.433519 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.433725 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.433794 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.434238 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.434292 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.434721 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.434758 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.435734 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.435833 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.436759 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.437728 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.437755 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.439717 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.440296 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.440935 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.441618 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.441717 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.441778 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.441810 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.441858 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.441934 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.444130 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.444469 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.444558 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.444873 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.445001 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.445026 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.445193 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.446604 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.446790 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.447574 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.447792 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.449759 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.450719 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.450987 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.451174 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.454277 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.455242 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.456392 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.458808 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466577 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466662 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466761 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466783 4651 reconciler_common.go:293] 
"Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466798 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466815 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466829 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466841 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466853 4651 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466869 4651 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466883 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466894 4651 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466906 4651 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466940 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466952 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466964 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466976 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.466992 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath 
\"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467004 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467016 4651 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467032 4651 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467074 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467086 4651 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467097 4651 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467116 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467135 4651 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467148 4651 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467159 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467175 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467187 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467199 4651 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467214 4651 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467227 4651 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467240 4651 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467252 4651 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467289 4651 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467304 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467316 4651 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467327 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467342 4651 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467354 4651 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467365 4651 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467378 4651 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467393 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467404 4651 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467415 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467430 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467441 4651 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467453 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467465 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467480 4651 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467491 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467504 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467515 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467530 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467542 4651 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467563 4651 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467578 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467589 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467604 4651 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467616 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467630 4651 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467641 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467652 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467664 4651 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467680 4651 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467692 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467705 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467717 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467733 4651 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467744 4651 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467757 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467773 4651 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467787 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467800 4651 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467811 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467827 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467840 4651 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467854 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467865 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467881 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467894 4651 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467906 4651 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467919 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467936 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467949 4651 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467962 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467980 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.467992 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468005 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468018 4651 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468052 4651 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468066 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468079 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468090 4651 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468105 4651 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468116 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468127 4651 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468141 4651 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468153 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468164 4651 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468175 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468189 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468201 4651 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468211 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468222 4651 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468235 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468247 4651 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468259 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468271 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468284 4651 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468295 4651 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468308 4651 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468321 4651 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468332 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468344 4651 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468354 4651 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468367 4651 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468384 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468395 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468406 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468420 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468430 4651 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468440 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468452 4651 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468468 4651 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468478 4651 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468490 4651 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468505 4651 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468515 4651 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468525 4651 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468547 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468561 4651 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468572 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468583 4651 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468594 4651 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468609 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468620 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468631 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468644 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468655 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468666 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468679 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468693 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468703 4651 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468713 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468726 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468741 4651 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468753 4651 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468762 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468780 4651 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468791 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468803 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468814 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468829 4651 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468839 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468849 4651 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468859 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468872 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468883 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468894 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468906 4651 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468918 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468928 4651 reconciler_common.go:293] "Volume detached for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468938 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468952 4651 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468965 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468977 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.468987 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469001 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469012 4651 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469023 4651 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469087 4651 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469104 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469115 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469125 4651 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469136 4651 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469151 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" 
DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469161 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469172 4651 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469187 4651 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469198 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469210 4651 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469222 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469237 4651 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469248 4651 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469260 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469271 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469286 4651 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469298 4651 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469309 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469323 4651 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.469375 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.470125 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.472753 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.476840 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.477881 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.486010 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.489742 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.496078 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.513111 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.514888 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336"} Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.515212 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.528646 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.528716 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.539664 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.549242 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.559366 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.568264 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.570598 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.570633 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.570649 4651 reconciler_common.go:293] "Volume detached for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.579236 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.653213 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 14:50:54 crc kubenswrapper[4651]: W1126 14:50:54.668240 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-9ea877b9cf842ce22a40b8dddac2701ee3150317a5b732120eefe597d5829ba5 WatchSource:0}: Error finding container 9ea877b9cf842ce22a40b8dddac2701ee3150317a5b732120eefe597d5829ba5: Status 404 returned error can't find the container with id 9ea877b9cf842ce22a40b8dddac2701ee3150317a5b732120eefe597d5829ba5 Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.678815 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 14:50:54 crc kubenswrapper[4651]: W1126 14:50:54.689711 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-1a26ed517223f5c839f58adfdfe9e1ea9ab8a7969825dff19429ebdb2893ebf7 WatchSource:0}: Error finding container 1a26ed517223f5c839f58adfdfe9e1ea9ab8a7969825dff19429ebdb2893ebf7: Status 404 returned error can't find the container with id 1a26ed517223f5c839f58adfdfe9e1ea9ab8a7969825dff19429ebdb2893ebf7 Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.722857 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 14:50:54 crc kubenswrapper[4651]: W1126 14:50:54.751601 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-f3ebe2c5810d7ee0e0e99ae4fe071d21f4761311ffb444d7d6d69006a87b809c WatchSource:0}: Error finding container f3ebe2c5810d7ee0e0e99ae4fe071d21f4761311ffb444d7d6d69006a87b809c: Status 404 returned error can't find the container with id f3ebe2c5810d7ee0e0e99ae4fe071d21f4761311ffb444d7d6d69006a87b809c Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.964901 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.971723 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.972924 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.972977 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.973014 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.973059 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.973080 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 
14:50:54.973224 4651 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.973263 4651 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.973224 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.973276 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.973290 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:55.973272513 +0000 UTC m=+23.399020117 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.973349 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.973359 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:55.973343935 +0000 UTC m=+23.399091539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.973364 4651 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.973375 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 14:50:55.973366136 +0000 UTC m=+23.399113730 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.973396 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:55.973384616 +0000 UTC m=+23.399132280 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.973334 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 14:50:54 crc kubenswrapper[4651]: E1126 14:50:54.973418 4651 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 14:50:54 crc 
kubenswrapper[4651]: E1126 14:50:54.973445 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:55.973436828 +0000 UTC m=+23.399184522 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.975329 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.981198 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:54 crc kubenswrapper[4651]: I1126 14:50:54.991000 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.000326 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.016006 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b5ceb0-b9c6-412e-ab66-35eb5612345d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb93b59a5a145c7430f6a0d2d20a52b82640be14fdbe0b3a09982193b8c6f23a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T14:50:52Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1126 14:50:37.411701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 14:50:37.412313 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1913773453/tls.crt::/tmp/serving-cert-1913773453/tls.key\\\\\\\"\\\\nI1126 14:50:52.937219 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 14:50:52.942362 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 14:50:52.942386 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 14:50:52.942416 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 14:50:52.942423 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 14:50:52.948567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 14:50:52.948593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 14:50:52.948599 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 14:50:52.948605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 14:50:52.948609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 14:50:52.948613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 14:50:52.948617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 14:50:52.948805 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 14:50:52.951090 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T14:50:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.027494 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.036094 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.047387 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.061268 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.098982 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.115602 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.135796 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b5ceb0-b9c6-412e-ab66-35eb5612345d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb93b59a5a145c7430f6a0d2d20a52b82640be14fdbe0b3a09982193b8c6f23a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T14:50:52Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1126 14:50:37.411701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 14:50:37.412313 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913773453/tls.crt::/tmp/serving-cert-1913773453/tls.key\\\\\\\"\\\\nI1126 14:50:52.937219 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 14:50:52.942362 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 14:50:52.942386 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 14:50:52.942416 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 14:50:52.942423 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 14:50:52.948567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 14:50:52.948593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 14:50:52.948599 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 14:50:52.948605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 14:50:52.948609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 14:50:52.948613 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 14:50:52.948617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 14:50:52.948805 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 14:50:52.951090 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T14:50:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.144908 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.153475 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.165532 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00e99124-d57f-4e6c-bb8a-46edab27b557\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5206d7f29dc3a389a62826de4a97864454278e6c3623dcde0fd418a4151e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ac3adc10715786992515543ed414422c509b2deefee47097229ed25286f3db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e1c55f35d59813dd0af255b2aba285aaf9feaf7f8db921afd04e0bb6bf8757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923210e4f2e95b5f2fb2fc0aa190ba52f4e2aa4029de177c68553a79b064b28b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T14:50:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.176317 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.401358 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.401358 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.401643 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.401535 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.405014 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.405720 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.406780 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.407656 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.408468 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.409190 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.409983 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.410734 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.411579 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.413363 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.414065 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.415014 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.415609 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.416212 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.416807 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.418313 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.419135 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.420280 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.421076 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.421807 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.422989 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.423699 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.424815 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.425689 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.426262 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.427601 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.428913 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.429561 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.430345 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.431410 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.432010 4651 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.432158 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.435169 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.435937 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.436742 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.438805 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.439977 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.440539 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.441574 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.442260 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.443142 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.443766 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.444820 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.445813 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.446318 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.447263 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.447819 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.449156 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.449708 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.450295 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.451198 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.451733 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.452802 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.453347 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.538976 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cf4717ec8f77d08ee0e3554a90d4a310e60e3c7d387c96a4710abd5a54ee2c99"} Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.539974 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d7ccb89a6cc84d86563e260626fbce1fe0bc616172e246a1164173157d171549"} Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.540596 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1a26ed517223f5c839f58adfdfe9e1ea9ab8a7969825dff19429ebdb2893ebf7"} Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.541712 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fc7a32032c122009a9371bf4b3d8f99574a8fbeb8672c6e02219c3ec711923c7"} Nov 26 14:50:55 crc 
kubenswrapper[4651]: I1126 14:50:55.541777 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9ea877b9cf842ce22a40b8dddac2701ee3150317a5b732120eefe597d5829ba5"} Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.542494 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f3ebe2c5810d7ee0e0e99ae4fe071d21f4761311ffb444d7d6d69006a87b809c"} Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.551295 4651 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.561291 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00e99124-d57f-4e6c-bb8a-46edab27b557\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5206d7f29dc3a389a62826de4a97864454278e6c3623dcde0fd418a4151e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ac3adc10715786992515543ed414422c509b2deefee47097229ed25286f3db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e1c55f35d59813dd0af255b2aba285aaf9feaf7f8db921afd04e0bb6bf8757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923210e4f2e95b5f2fb2fc0aa190ba52f4e2aa4029de177c68553a79b064b28b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T14:50:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.577685 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.591531 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.603064 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.617643 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b5ceb0-b9c6-412e-ab66-35eb5612345d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb93b59a5a145c7430f6a0d2d20a52b82640be14fdbe0b3a09982193b8c6f23a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T14:50:52Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1126 14:50:37.411701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 14:50:37.412313 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1913773453/tls.crt::/tmp/serving-cert-1913773453/tls.key\\\\\\\"\\\\nI1126 14:50:52.937219 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 14:50:52.942362 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 14:50:52.942386 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 14:50:52.942416 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 14:50:52.942423 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 14:50:52.948567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 14:50:52.948593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 14:50:52.948599 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 14:50:52.948605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 14:50:52.948609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 14:50:52.948613 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 14:50:52.948617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 14:50:52.948805 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 14:50:52.951090 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T14:50:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.699792 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4717ec8f77d08ee0e3554a90d4a310e60e3c7d387c96a4710abd5a54ee2c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ccb89a6cc84d86563e260626fbce1fe0bc616172e246a1164173157d171549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.723410 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.736381 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.750612 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b5ceb0-b9c6-412e-ab66-35eb5612345d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb93b59a5a145c7430f6a0d2d20a52b82640be14fdbe0b3a09982193b8c6f23a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T14:50:52Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1126 14:50:37.411701 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 14:50:37.412313 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1913773453/tls.crt::/tmp/serving-cert-1913773453/tls.key\\\\\\\"\\\\nI1126 14:50:52.937219 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 14:50:52.942362 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 14:50:52.942386 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 14:50:52.942416 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 14:50:52.942423 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 14:50:52.948567 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 14:50:52.948593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 14:50:52.948599 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 14:50:52.948605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 14:50:52.948609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 14:50:52.948613 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 14:50:52.948617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 14:50:52.948805 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 14:50:52.951090 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T14:50:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T14:50:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.763120 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4717ec8f77d08ee0e3554a90d4a310e60e3c7d387c96a4710abd5a54ee2c99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ccb89a6cc84d86563e260626fbce1fe0bc616172e246a1164173157d171549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 
14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.776710 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.788085 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.806507 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00e99124-d57f-4e6c-bb8a-46edab27b557\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5206d7f29dc3a389a62826de4a97864454278e6c3623dcde0fd418a4151e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ac3adc10715786992515543ed414422c509b2deefee47097229ed25286f3db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e1c55f35d59813dd0af255b2aba285aaf9feaf7f8db921afd04e0bb6bf8757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923210e4f2e95b5f2fb2fc0aa190ba52f4e2aa4029de177c68553a79b064b28b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T14:50:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.824682 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc7a32032c122009a9371bf4b3d8f99574a8fbeb8672c6e02219c3ec711923c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.838128 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.851379 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:55Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.980458 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.980554 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.980595 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.980661 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:50:57.980620895 +0000 UTC m=+25.406368539 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.980703 4651 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.980737 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.980759 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.980755 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.980772 4651 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.980775 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:57.980755009 +0000 UTC m=+25.406502733 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.980827 4651 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 14:50:55 crc kubenswrapper[4651]: I1126 14:50:55.980956 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.980985 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:57.980972524 +0000 UTC m=+25.406720238 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.981027 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:57.981011215 +0000 UTC m=+25.406758919 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.981210 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.981247 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.981270 4651 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Nov 26 14:50:55 crc kubenswrapper[4651]: E1126 14:50:55.981353 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:57.981328714 +0000 UTC m=+25.407076368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 14:50:56 crc kubenswrapper[4651]: I1126 14:50:56.401230 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:50:56 crc kubenswrapper[4651]: E1126 14:50:56.401358 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 14:50:57 crc kubenswrapper[4651]: I1126 14:50:57.401545 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:50:57 crc kubenswrapper[4651]: E1126 14:50:57.401665 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 14:50:57 crc kubenswrapper[4651]: I1126 14:50:57.401548 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:50:57 crc kubenswrapper[4651]: E1126 14:50:57.402089 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 14:50:57 crc kubenswrapper[4651]: I1126 14:50:57.549474 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7ad76691cdd8631e1796e005ac3b13ae16cfba44411ea37277a10b4d8a076745"} Nov 26 14:50:57 crc kubenswrapper[4651]: I1126 14:50:57.562623 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00e99124-d57f-4e6c-bb8a-46edab27b557\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5206d7f29dc3a389a62826de4a97864454278e6c3623dcde0fd418a4151e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controll
er\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ac3adc10715786992515543ed414422c509b2deefee47097229ed25286f3db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e1c55f35d59813dd0af255b2aba285aaf9feaf7f8db921afd04e0bb6bf8757\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923210e4f2e95b5f2fb2fc0aa190ba52f4e2aa4029de177c68553a79b064b28b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T14:50:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:57Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:57 crc kubenswrapper[4651]: I1126 14:50:57.575460 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc7a32032c122009a9371bf4b3d8f99574a8fbeb8672c6e02219c3ec711923c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:57Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:57 crc kubenswrapper[4651]: I1126 14:50:57.587018 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T14:50:57Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:57 crc kubenswrapper[4651]: I1126 14:50:57.600466 4651 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T14:50:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ad76691cdd8631e1796e005ac3b13ae16cfba44411ea37277a10b4d8a076745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T14:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T14:50:57Z is after 2025-08-24T17:21:41Z" Nov 26 14:50:57 crc kubenswrapper[4651]: I1126 14:50:57.632118 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=3.632094595 podStartE2EDuration="3.632094595s" podCreationTimestamp="2025-11-26 14:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:50:57.632013903 +0000 UTC m=+25.057761527" watchObservedRunningTime="2025-11-26 14:50:57.632094595 +0000 UTC m=+25.057842209" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.001575 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.001644 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.001679 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:50:58 crc 
kubenswrapper[4651]: I1126 14:50:58.001702 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.001728 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.001801 4651 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.001856 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:02.001838386 +0000 UTC m=+29.427585990 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.002167 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:02.002139113 +0000 UTC m=+29.427886717 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.002168 4651 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.002238 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:02.002227866 +0000 UTC m=+29.427975560 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.002237 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.002269 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.002281 4651 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.002331 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:02.002316438 +0000 UTC m=+29.428064042 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.002178 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.002359 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.002367 4651 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.002394 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:02.00238819 +0000 UTC m=+29.428135794 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.401486 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.401624 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.462871 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gjjss"] Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.463208 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-cm2nk"] Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.463371 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gjjss" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.463858 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cm2nk" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.468877 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.469244 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.469326 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.469344 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.469374 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.469561 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.470584 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.494149 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=4.494131845 podStartE2EDuration="4.494131845s" podCreationTimestamp="2025-11-26 14:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:50:58.493229872 +0000 UTC m=+25.918977486" watchObservedRunningTime="2025-11-26 14:50:58.494131845 +0000 UTC m=+25.919879449" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.506401 4651 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becf773b-be6a-440a-9637-860572d65926-serviceca\") pod \"node-ca-cm2nk\" (UID: \"becf773b-be6a-440a-9637-860572d65926\") " pod="openshift-image-registry/node-ca-cm2nk" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.506466 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4brd\" (UniqueName: \"kubernetes.io/projected/190e446e-c80b-460f-a64f-93cd87a6d4a8-kube-api-access-d4brd\") pod \"node-resolver-gjjss\" (UID: \"190e446e-c80b-460f-a64f-93cd87a6d4a8\") " pod="openshift-dns/node-resolver-gjjss" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.506490 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/190e446e-c80b-460f-a64f-93cd87a6d4a8-hosts-file\") pod \"node-resolver-gjjss\" (UID: \"190e446e-c80b-460f-a64f-93cd87a6d4a8\") " pod="openshift-dns/node-resolver-gjjss" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.506509 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/becf773b-be6a-440a-9637-860572d65926-host\") pod \"node-ca-cm2nk\" (UID: \"becf773b-be6a-440a-9637-860572d65926\") " pod="openshift-image-registry/node-ca-cm2nk" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.506549 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j6dr\" (UniqueName: \"kubernetes.io/projected/becf773b-be6a-440a-9637-860572d65926-kube-api-access-4j6dr\") pod \"node-ca-cm2nk\" (UID: \"becf773b-be6a-440a-9637-860572d65926\") " pod="openshift-image-registry/node-ca-cm2nk" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.556724 4651 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-machine-config-operator/machine-config-daemon-99mrs"] Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.557172 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.558957 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.561936 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.562525 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.563394 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.563647 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.606700 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-c884v"] Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.607052 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.607270 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1233982f-5a21-4fdd-98e0-e11b5cedc385-rootfs\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.607446 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4brd\" (UniqueName: \"kubernetes.io/projected/190e446e-c80b-460f-a64f-93cd87a6d4a8-kube-api-access-d4brd\") pod \"node-resolver-gjjss\" (UID: \"190e446e-c80b-460f-a64f-93cd87a6d4a8\") " pod="openshift-dns/node-resolver-gjjss" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.607625 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/190e446e-c80b-460f-a64f-93cd87a6d4a8-hosts-file\") pod \"node-resolver-gjjss\" (UID: \"190e446e-c80b-460f-a64f-93cd87a6d4a8\") " pod="openshift-dns/node-resolver-gjjss" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.607703 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/becf773b-be6a-440a-9637-860572d65926-host\") pod \"node-ca-cm2nk\" (UID: \"becf773b-be6a-440a-9637-860572d65926\") " pod="openshift-image-registry/node-ca-cm2nk" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.607773 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1233982f-5a21-4fdd-98e0-e11b5cedc385-proxy-tls\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " 
pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.607845 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdwbz\" (UniqueName: \"kubernetes.io/projected/1233982f-5a21-4fdd-98e0-e11b5cedc385-kube-api-access-pdwbz\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.607981 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/190e446e-c80b-460f-a64f-93cd87a6d4a8-hosts-file\") pod \"node-resolver-gjjss\" (UID: \"190e446e-c80b-460f-a64f-93cd87a6d4a8\") " pod="openshift-dns/node-resolver-gjjss" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.607943 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/becf773b-be6a-440a-9637-860572d65926-host\") pod \"node-ca-cm2nk\" (UID: \"becf773b-be6a-440a-9637-860572d65926\") " pod="openshift-image-registry/node-ca-cm2nk" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.608195 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j6dr\" (UniqueName: \"kubernetes.io/projected/becf773b-be6a-440a-9637-860572d65926-kube-api-access-4j6dr\") pod \"node-ca-cm2nk\" (UID: \"becf773b-be6a-440a-9637-860572d65926\") " pod="openshift-image-registry/node-ca-cm2nk" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.608311 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becf773b-be6a-440a-9637-860572d65926-serviceca\") pod \"node-ca-cm2nk\" (UID: \"becf773b-be6a-440a-9637-860572d65926\") " pod="openshift-image-registry/node-ca-cm2nk" Nov 26 14:50:58 crc 
kubenswrapper[4651]: I1126 14:50:58.608436 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1233982f-5a21-4fdd-98e0-e11b5cedc385-mcd-auth-proxy-config\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.609202 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-qfclp"] Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.609384 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becf773b-be6a-440a-9637-860572d65926-serviceca\") pod \"node-ca-cm2nk\" (UID: \"becf773b-be6a-440a-9637-860572d65926\") " pod="openshift-image-registry/node-ca-cm2nk" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.609844 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.611601 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.612049 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.613735 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.614223 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.615714 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.615737 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.615781 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.639463 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4brd\" (UniqueName: \"kubernetes.io/projected/190e446e-c80b-460f-a64f-93cd87a6d4a8-kube-api-access-d4brd\") pod \"node-resolver-gjjss\" (UID: \"190e446e-c80b-460f-a64f-93cd87a6d4a8\") " pod="openshift-dns/node-resolver-gjjss" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.645122 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j6dr\" (UniqueName: \"kubernetes.io/projected/becf773b-be6a-440a-9637-860572d65926-kube-api-access-4j6dr\") pod \"node-ca-cm2nk\" 
(UID: \"becf773b-be6a-440a-9637-860572d65926\") " pod="openshift-image-registry/node-ca-cm2nk" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.705410 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-79fzh"] Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.705913 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.705982 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79fzh" podUID="46f059e4-ddf4-4e21-b528-0cc9cec8afa1" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709235 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1233982f-5a21-4fdd-98e0-e11b5cedc385-proxy-tls\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709274 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88feea33-aa22-45e0-9066-e40e92590ca5-cni-binary-copy\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709293 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-system-cni-dir\") pod 
\"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709318 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-run-netns\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709344 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-var-lib-cni-bin\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709358 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxt8\" (UniqueName: \"kubernetes.io/projected/88feea33-aa22-45e0-9066-e40e92590ca5-kube-api-access-dgxt8\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709376 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1233982f-5a21-4fdd-98e0-e11b5cedc385-mcd-auth-proxy-config\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709391 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/1d7e0f33-2db2-430b-8991-9beb97979488-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709404 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-cnibin\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709417 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-etc-kubernetes\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709431 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-cnibin\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709448 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-hostroot\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709620 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/88feea33-aa22-45e0-9066-e40e92590ca5-multus-daemon-config\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709664 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-run-k8s-cni-cncf-io\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709691 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kcc2\" (UniqueName: \"kubernetes.io/projected/1d7e0f33-2db2-430b-8991-9beb97979488-kube-api-access-5kcc2\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709718 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdwbz\" (UniqueName: \"kubernetes.io/projected/1233982f-5a21-4fdd-98e0-e11b5cedc385-kube-api-access-pdwbz\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709761 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-multus-cni-dir\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709849 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-system-cni-dir\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709892 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-os-release\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709917 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d7e0f33-2db2-430b-8991-9beb97979488-cni-binary-copy\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709949 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-multus-socket-dir-parent\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.709971 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.710013 4651 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1233982f-5a21-4fdd-98e0-e11b5cedc385-rootfs\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.710052 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-multus-conf-dir\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.710048 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1233982f-5a21-4fdd-98e0-e11b5cedc385-mcd-auth-proxy-config\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.710074 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1233982f-5a21-4fdd-98e0-e11b5cedc385-rootfs\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.710095 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-var-lib-cni-multus\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.710123 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-var-lib-kubelet\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.710143 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-run-multus-certs\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.710163 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-os-release\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.718278 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1233982f-5a21-4fdd-98e0-e11b5cedc385-proxy-tls\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.736920 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdwbz\" (UniqueName: \"kubernetes.io/projected/1233982f-5a21-4fdd-98e0-e11b5cedc385-kube-api-access-pdwbz\") pod \"machine-config-daemon-99mrs\" (UID: \"1233982f-5a21-4fdd-98e0-e11b5cedc385\") " pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.777861 4651 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gjjss" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.785265 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cm2nk" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.794893 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mmgnh"] Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.795861 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.800895 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.801778 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.803251 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.803297 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.803510 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.804550 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.807502 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810565 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-var-lib-cni-multus\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810607 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-var-lib-kubelet\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810631 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-run-multus-certs\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810651 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-os-release\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810672 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-system-cni-dir\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810720 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88feea33-aa22-45e0-9066-e40e92590ca5-cni-binary-copy\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810743 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-run-netns\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810764 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-var-lib-cni-bin\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810785 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxt8\" (UniqueName: \"kubernetes.io/projected/88feea33-aa22-45e0-9066-e40e92590ca5-kube-api-access-dgxt8\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810813 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs\") pod \"network-metrics-daemon-79fzh\" (UID: \"46f059e4-ddf4-4e21-b528-0cc9cec8afa1\") " pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810848 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/1d7e0f33-2db2-430b-8991-9beb97979488-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810871 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-cnibin\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810892 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-etc-kubernetes\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810913 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-cnibin\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810936 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-hostroot\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810966 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/88feea33-aa22-45e0-9066-e40e92590ca5-multus-daemon-config\") pod \"multus-c884v\" (UID: 
\"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.810990 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-run-k8s-cni-cncf-io\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811013 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kcc2\" (UniqueName: \"kubernetes.io/projected/1d7e0f33-2db2-430b-8991-9beb97979488-kube-api-access-5kcc2\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811035 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-multus-cni-dir\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811059 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-system-cni-dir\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811097 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-os-release\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc 
kubenswrapper[4651]: I1126 14:50:58.811117 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d7e0f33-2db2-430b-8991-9beb97979488-cni-binary-copy\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811141 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzctv\" (UniqueName: \"kubernetes.io/projected/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-kube-api-access-hzctv\") pod \"network-metrics-daemon-79fzh\" (UID: \"46f059e4-ddf4-4e21-b528-0cc9cec8afa1\") " pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811163 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-multus-socket-dir-parent\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811186 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811230 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-multus-conf-dir\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 
crc kubenswrapper[4651]: I1126 14:50:58.811285 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-multus-conf-dir\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811318 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-var-lib-cni-multus\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811323 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-var-lib-kubelet\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811364 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-run-multus-certs\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811627 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-os-release\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.811673 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-system-cni-dir\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812009 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d7e0f33-2db2-430b-8991-9beb97979488-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812095 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-cnibin\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812135 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-etc-kubernetes\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812167 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-cnibin\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812196 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-hostroot\") 
pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812279 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88feea33-aa22-45e0-9066-e40e92590ca5-cni-binary-copy\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812322 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-run-netns\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812353 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-var-lib-cni-bin\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812416 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-os-release\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812458 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-host-run-k8s-cni-cncf-io\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812815 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-multus-cni-dir\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.812905 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-system-cni-dir\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.813458 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d7e0f33-2db2-430b-8991-9beb97979488-cni-binary-copy\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.813914 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/88feea33-aa22-45e0-9066-e40e92590ca5-multus-socket-dir-parent\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.814505 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d7e0f33-2db2-430b-8991-9beb97979488-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.814603 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/88feea33-aa22-45e0-9066-e40e92590ca5-multus-daemon-config\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.835346 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kcc2\" (UniqueName: \"kubernetes.io/projected/1d7e0f33-2db2-430b-8991-9beb97979488-kube-api-access-5kcc2\") pod \"multus-additional-cni-plugins-qfclp\" (UID: \"1d7e0f33-2db2-430b-8991-9beb97979488\") " pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.835490 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxt8\" (UniqueName: \"kubernetes.io/projected/88feea33-aa22-45e0-9066-e40e92590ca5-kube-api-access-dgxt8\") pod \"multus-c884v\" (UID: \"88feea33-aa22-45e0-9066-e40e92590ca5\") " pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.868940 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:50:58 crc kubenswrapper[4651]: W1126 14:50:58.882729 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1233982f_5a21_4fdd_98e0_e11b5cedc385.slice/crio-eb39d4234e001d5fe4883c42114e4a0d138d23bccfe6ede5c8a7e54f3f694802 WatchSource:0}: Error finding container eb39d4234e001d5fe4883c42114e4a0d138d23bccfe6ede5c8a7e54f3f694802: Status 404 returned error can't find the container with id eb39d4234e001d5fe4883c42114e4a0d138d23bccfe6ede5c8a7e54f3f694802 Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913130 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-etc-openvswitch\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913172 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-bin\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913195 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-script-lib\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913231 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-openvswitch\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913252 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-kubelet\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913274 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-ovn-kubernetes\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913296 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-systemd\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913317 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nm9k\" (UniqueName: \"kubernetes.io/projected/e9ee7939-7a21-4f3a-b534-056415581b10-kube-api-access-5nm9k\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913342 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hzctv\" (UniqueName: \"kubernetes.io/projected/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-kube-api-access-hzctv\") pod \"network-metrics-daemon-79fzh\" (UID: \"46f059e4-ddf4-4e21-b528-0cc9cec8afa1\") " pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913364 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-var-lib-openvswitch\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913385 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-netd\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913409 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913432 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-log-socket\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc 
kubenswrapper[4651]: I1126 14:50:58.913463 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-node-log\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913493 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-ovn\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913513 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-env-overrides\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913552 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs\") pod \"network-metrics-daemon-79fzh\" (UID: \"46f059e4-ddf4-4e21-b528-0cc9cec8afa1\") " pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913575 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-netns\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: 
I1126 14:50:58.913596 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-systemd-units\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913616 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-slash\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913634 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-config\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.913655 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9ee7939-7a21-4f3a-b534-056415581b10-ovn-node-metrics-cert\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.914153 4651 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 14:50:58 crc kubenswrapper[4651]: E1126 14:50:58.914220 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs 
podName:46f059e4-ddf4-4e21-b528-0cc9cec8afa1 nodeName:}" failed. No retries permitted until 2025-11-26 14:50:59.414202495 +0000 UTC m=+26.839950099 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs") pod "network-metrics-daemon-79fzh" (UID: "46f059e4-ddf4-4e21-b528-0cc9cec8afa1") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.919076 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c884v" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.924683 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qfclp" Nov 26 14:50:58 crc kubenswrapper[4651]: I1126 14:50:58.940546 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzctv\" (UniqueName: \"kubernetes.io/projected/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-kube-api-access-hzctv\") pod \"network-metrics-daemon-79fzh\" (UID: \"46f059e4-ddf4-4e21-b528-0cc9cec8afa1\") " pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:50:58 crc kubenswrapper[4651]: W1126 14:50:58.951870 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88feea33_aa22_45e0_9066_e40e92590ca5.slice/crio-290772b7c50500cffbedcec3e44858cc982b80819e27cafc1026b81bfce21cdc WatchSource:0}: Error finding container 290772b7c50500cffbedcec3e44858cc982b80819e27cafc1026b81bfce21cdc: Status 404 returned error can't find the container with id 290772b7c50500cffbedcec3e44858cc982b80819e27cafc1026b81bfce21cdc Nov 26 14:50:58 crc kubenswrapper[4651]: W1126 14:50:58.997956 4651 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d7e0f33_2db2_430b_8991_9beb97979488.slice/crio-118676cb5e678e5dd56b45b0c037103b518ffe700755656498953b1941954fbe WatchSource:0}: Error finding container 118676cb5e678e5dd56b45b0c037103b518ffe700755656498953b1941954fbe: Status 404 returned error can't find the container with id 118676cb5e678e5dd56b45b0c037103b518ffe700755656498953b1941954fbe Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014602 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-netns\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014636 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-systemd-units\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014655 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-slash\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014671 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-config\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014686 
4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9ee7939-7a21-4f3a-b534-056415581b10-ovn-node-metrics-cert\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014701 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-etc-openvswitch\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014715 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-bin\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014729 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-script-lib\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014752 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-openvswitch\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014769 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-kubelet\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014782 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-ovn-kubernetes\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014799 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-systemd\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014814 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nm9k\" (UniqueName: \"kubernetes.io/projected/e9ee7939-7a21-4f3a-b534-056415581b10-kube-api-access-5nm9k\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014828 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-var-lib-openvswitch\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014845 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-netd\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014861 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014879 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-log-socket\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014899 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-node-log\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014922 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-ovn\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.014936 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-env-overrides\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.015449 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-env-overrides\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.015498 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-netns\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.015522 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-systemd-units\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.015545 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-slash\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.015956 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-config\") pod \"ovnkube-node-mmgnh\" (UID: 
\"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.018461 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9ee7939-7a21-4f3a-b534-056415581b10-ovn-node-metrics-cert\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.018507 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-etc-openvswitch\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.018533 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-bin\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.018991 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-script-lib\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.019033 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-openvswitch\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 
14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.019097 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-kubelet\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.019136 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-ovn-kubernetes\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.019175 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-systemd\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.019416 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-var-lib-openvswitch\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.019461 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-netd\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.019485 4651 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.019507 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-log-socket\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.019528 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-node-log\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.019549 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-ovn\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.035775 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nm9k\" (UniqueName: \"kubernetes.io/projected/e9ee7939-7a21-4f3a-b534-056415581b10-kube-api-access-5nm9k\") pod \"ovnkube-node-mmgnh\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.123978 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.291882 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5"] Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.292313 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.297285 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.297391 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.336772 4651 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.338483 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.338624 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.338708 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.338881 4651 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.385143 4651 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.385604 4651 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 26 14:50:59 crc kubenswrapper[4651]: 
I1126 14:50:59.386913 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.386960 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.386973 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.386991 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.387002 4651 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T14:50:59Z","lastTransitionTime":"2025-11-26T14:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.401957 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.402043 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:50:59 crc kubenswrapper[4651]: E1126 14:50:59.402085 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 14:50:59 crc kubenswrapper[4651]: E1126 14:50:59.402182 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.419877 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60244634-2838-4dc3-81c1-ca3e682e6f86-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.420238 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60244634-2838-4dc3-81c1-ca3e682e6f86-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.420289 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60244634-2838-4dc3-81c1-ca3e682e6f86-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 
14:50:59.420310 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs\") pod \"network-metrics-daemon-79fzh\" (UID: \"46f059e4-ddf4-4e21-b528-0cc9cec8afa1\") " pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:50:59 crc kubenswrapper[4651]: E1126 14:50:59.420400 4651 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 14:50:59 crc kubenswrapper[4651]: E1126 14:50:59.420450 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs podName:46f059e4-ddf4-4e21-b528-0cc9cec8afa1 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:00.420436247 +0000 UTC m=+27.846183851 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs") pod "network-metrics-daemon-79fzh" (UID: "46f059e4-ddf4-4e21-b528-0cc9cec8afa1") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.420515 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmjtq\" (UniqueName: \"kubernetes.io/projected/60244634-2838-4dc3-81c1-ca3e682e6f86-kube-api-access-lmjtq\") pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.461943 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.461980 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.461991 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.462006 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.462015 4651 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T14:50:59Z","lastTransitionTime":"2025-11-26T14:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.521911 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60244634-2838-4dc3-81c1-ca3e682e6f86-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.521975 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60244634-2838-4dc3-81c1-ca3e682e6f86-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.522005 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60244634-2838-4dc3-81c1-ca3e682e6f86-ovn-control-plane-metrics-cert\") 
pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.522070 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmjtq\" (UniqueName: \"kubernetes.io/projected/60244634-2838-4dc3-81c1-ca3e682e6f86-kube-api-access-lmjtq\") pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.522649 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60244634-2838-4dc3-81c1-ca3e682e6f86-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.522824 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60244634-2838-4dc3-81c1-ca3e682e6f86-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.525081 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60244634-2838-4dc3-81c1-ca3e682e6f86-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.539459 4651 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lmjtq\" (UniqueName: \"kubernetes.io/projected/60244634-2838-4dc3-81c1-ca3e682e6f86-kube-api-access-lmjtq\") pod \"ovnkube-control-plane-749d76644c-xmsh5\" (UID: \"60244634-2838-4dc3-81c1-ca3e682e6f86\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.554833 4651 generic.go:334] "Generic (PLEG): container finished" podID="e9ee7939-7a21-4f3a-b534-056415581b10" containerID="3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5" exitCode=0 Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.554904 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerDied","Data":"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.554933 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerStarted","Data":"640be5f08b7fc2d1c69bf3c5824b5f712a8025db0e667a4024c8006add07c847"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.558385 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c884v" event={"ID":"88feea33-aa22-45e0-9066-e40e92590ca5","Type":"ContainerStarted","Data":"43ad7c819f1d281b57f0f77e053e23c095780a8db7e169d72abc767a922fcfd8"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.558437 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c884v" event={"ID":"88feea33-aa22-45e0-9066-e40e92590ca5","Type":"ContainerStarted","Data":"290772b7c50500cffbedcec3e44858cc982b80819e27cafc1026b81bfce21cdc"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.562800 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qfclp" 
event={"ID":"1d7e0f33-2db2-430b-8991-9beb97979488","Type":"ContainerStarted","Data":"77f6faa13bedea8bbfa257f1ea0c448528a2b84d7766d08ad78e358e18717a75"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.562843 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qfclp" event={"ID":"1d7e0f33-2db2-430b-8991-9beb97979488","Type":"ContainerStarted","Data":"118676cb5e678e5dd56b45b0c037103b518ffe700755656498953b1941954fbe"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.564721 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gjjss" event={"ID":"190e446e-c80b-460f-a64f-93cd87a6d4a8","Type":"ContainerStarted","Data":"eea3913bcd63ff0375805bcd61ddf10984ed9ced021f5995be2e5386189950b8"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.564767 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gjjss" event={"ID":"190e446e-c80b-460f-a64f-93cd87a6d4a8","Type":"ContainerStarted","Data":"049a804bca16dddc83eeb60fd38ab717703c4c0e462ea6ce1d3023f844fa7c1a"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.566635 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"dc05fc144a73b4613aeb5b93e6d0ed715fe02184d6d0d15174f5965b9798a2f1"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.566669 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"04c4e1b60bd0f3f1f0ee1ad045adcf48c93c1df3028087c0c63c1fc18ffe7234"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.566714 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" 
event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"eb39d4234e001d5fe4883c42114e4a0d138d23bccfe6ede5c8a7e54f3f694802"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.567600 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cm2nk" event={"ID":"becf773b-be6a-440a-9637-860572d65926","Type":"ContainerStarted","Data":"1524056213bd3e22235ce52b57d15c1e7b5325cc0fb630f909f4c4f7d2c8b229"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.567653 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cm2nk" event={"ID":"becf773b-be6a-440a-9637-860572d65926","Type":"ContainerStarted","Data":"e1843f4fb536ab1b93133f47a9f5cc0927b15d6caa8fb435e92a90480e920583"} Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.607071 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" Nov 26 14:50:59 crc kubenswrapper[4651]: W1126 14:50:59.621462 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60244634_2838_4dc3_81c1_ca3e682e6f86.slice/crio-65469d639f58d58368081d09b41fb0da0c2d816e035aa88a2a9759ee3cb0b9f9 WatchSource:0}: Error finding container 65469d639f58d58368081d09b41fb0da0c2d816e035aa88a2a9759ee3cb0b9f9: Status 404 returned error can't find the container with id 65469d639f58d58368081d09b41fb0da0c2d816e035aa88a2a9759ee3cb0b9f9 Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.633648 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cm2nk" podStartSLOduration=1.633630504 podStartE2EDuration="1.633630504s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:50:59.61463632 +0000 UTC 
m=+27.040383924" watchObservedRunningTime="2025-11-26 14:50:59.633630504 +0000 UTC m=+27.059378118" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.634174 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podStartSLOduration=1.634168078 podStartE2EDuration="1.634168078s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:50:59.633090051 +0000 UTC m=+27.058837665" watchObservedRunningTime="2025-11-26 14:50:59.634168078 +0000 UTC m=+27.059915692" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.657814 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gjjss" podStartSLOduration=1.657784302 podStartE2EDuration="1.657784302s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:50:59.656782477 +0000 UTC m=+27.082530101" watchObservedRunningTime="2025-11-26 14:50:59.657784302 +0000 UTC m=+27.083531906" Nov 26 14:50:59 crc kubenswrapper[4651]: I1126 14:50:59.697706 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-c884v" podStartSLOduration=1.697684371 podStartE2EDuration="1.697684371s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:50:59.677389692 +0000 UTC m=+27.103137316" watchObservedRunningTime="2025-11-26 14:50:59.697684371 +0000 UTC m=+27.123431975" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.121409 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm"] 
Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.121808 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.126257 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.126994 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.127519 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.127571 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.230421 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d770f8e5-3f25-4e99-a859-9b109e9824b8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.230668 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d770f8e5-3f25-4e99-a859-9b109e9824b8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.230803 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d770f8e5-3f25-4e99-a859-9b109e9824b8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.230942 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d770f8e5-3f25-4e99-a859-9b109e9824b8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.231015 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d770f8e5-3f25-4e99-a859-9b109e9824b8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.331861 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d770f8e5-3f25-4e99-a859-9b109e9824b8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.331972 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d770f8e5-3f25-4e99-a859-9b109e9824b8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" 
(UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.332013 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d770f8e5-3f25-4e99-a859-9b109e9824b8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.332079 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d770f8e5-3f25-4e99-a859-9b109e9824b8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.332104 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d770f8e5-3f25-4e99-a859-9b109e9824b8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.332127 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d770f8e5-3f25-4e99-a859-9b109e9824b8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.332206 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d770f8e5-3f25-4e99-a859-9b109e9824b8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.332673 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d770f8e5-3f25-4e99-a859-9b109e9824b8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.337143 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d770f8e5-3f25-4e99-a859-9b109e9824b8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.349895 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d770f8e5-3f25-4e99-a859-9b109e9824b8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jrqgm\" (UID: \"d770f8e5-3f25-4e99-a859-9b109e9824b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.401228 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.401249 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:51:00 crc kubenswrapper[4651]: E1126 14:51:00.401378 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79fzh" podUID="46f059e4-ddf4-4e21-b528-0cc9cec8afa1" Nov 26 14:51:00 crc kubenswrapper[4651]: E1126 14:51:00.401483 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.433465 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs\") pod \"network-metrics-daemon-79fzh\" (UID: \"46f059e4-ddf4-4e21-b528-0cc9cec8afa1\") " pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:51:00 crc kubenswrapper[4651]: E1126 14:51:00.433638 4651 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 14:51:00 crc kubenswrapper[4651]: E1126 14:51:00.433743 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs podName:46f059e4-ddf4-4e21-b528-0cc9cec8afa1 nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:02.433717422 +0000 UTC m=+29.859465036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs") pod "network-metrics-daemon-79fzh" (UID: "46f059e4-ddf4-4e21-b528-0cc9cec8afa1") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.434971 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.579906 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" event={"ID":"d770f8e5-3f25-4e99-a859-9b109e9824b8","Type":"ContainerStarted","Data":"c1b81cc97953c866d1792042ddfd6381bfda5fc1d7e136a2e52aa03bc2a72a57"} Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.584167 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" event={"ID":"60244634-2838-4dc3-81c1-ca3e682e6f86","Type":"ContainerStarted","Data":"1494516572ba56872e2c856849b2d93c042d0ba55b323228bbeddedc891e483b"} Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.584201 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" event={"ID":"60244634-2838-4dc3-81c1-ca3e682e6f86","Type":"ContainerStarted","Data":"18f8c35f6944aeb5c8553a44578458fb11b03c69457426ceacc97fdab2088a62"} Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.584217 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" event={"ID":"60244634-2838-4dc3-81c1-ca3e682e6f86","Type":"ContainerStarted","Data":"65469d639f58d58368081d09b41fb0da0c2d816e035aa88a2a9759ee3cb0b9f9"} Nov 26 14:51:00 
crc kubenswrapper[4651]: I1126 14:51:00.589850 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerStarted","Data":"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c"} Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.589893 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerStarted","Data":"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d"} Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.589906 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerStarted","Data":"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb"} Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.589918 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerStarted","Data":"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8"} Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.589930 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerStarted","Data":"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a"} Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.589941 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerStarted","Data":"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111"} Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.593963 4651 generic.go:334] "Generic 
(PLEG): container finished" podID="1d7e0f33-2db2-430b-8991-9beb97979488" containerID="77f6faa13bedea8bbfa257f1ea0c448528a2b84d7766d08ad78e358e18717a75" exitCode=0 Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.594020 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qfclp" event={"ID":"1d7e0f33-2db2-430b-8991-9beb97979488","Type":"ContainerDied","Data":"77f6faa13bedea8bbfa257f1ea0c448528a2b84d7766d08ad78e358e18717a75"} Nov 26 14:51:00 crc kubenswrapper[4651]: I1126 14:51:00.599091 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xmsh5" podStartSLOduration=1.599079634 podStartE2EDuration="1.599079634s" podCreationTimestamp="2025-11-26 14:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:00.598618772 +0000 UTC m=+28.024366376" watchObservedRunningTime="2025-11-26 14:51:00.599079634 +0000 UTC m=+28.024827238" Nov 26 14:51:01 crc kubenswrapper[4651]: I1126 14:51:01.918787 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:51:01 crc kubenswrapper[4651]: I1126 14:51:01.919108 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:51:01 crc kubenswrapper[4651]: E1126 14:51:01.919160 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 14:51:01 crc kubenswrapper[4651]: E1126 14:51:01.919318 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 14:51:01 crc kubenswrapper[4651]: I1126 14:51:01.918818 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:51:01 crc kubenswrapper[4651]: I1126 14:51:01.918785 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:51:01 crc kubenswrapper[4651]: E1126 14:51:01.919489 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 14:51:01 crc kubenswrapper[4651]: E1126 14:51:01.919540 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-79fzh" podUID="46f059e4-ddf4-4e21-b528-0cc9cec8afa1" Nov 26 14:51:01 crc kubenswrapper[4651]: I1126 14:51:01.929803 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" event={"ID":"d770f8e5-3f25-4e99-a859-9b109e9824b8","Type":"ContainerStarted","Data":"d6d085064c67c1fcb11eb6500cba450621f8d7b05fb3ad1c2e0867647ed0a915"} Nov 26 14:51:01 crc kubenswrapper[4651]: I1126 14:51:01.932791 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qfclp" event={"ID":"1d7e0f33-2db2-430b-8991-9beb97979488","Type":"ContainerStarted","Data":"e32e8f73a218f270709a1a67a58ada60109b10a4f73b80411f68326e37856621"} Nov 26 14:51:01 crc kubenswrapper[4651]: I1126 14:51:01.973411 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jrqgm" podStartSLOduration=3.973381113 podStartE2EDuration="3.973381113s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:01.946332239 +0000 UTC m=+29.372079833" watchObservedRunningTime="2025-11-26 14:51:01.973381113 +0000 UTC m=+29.399128767" Nov 26 14:51:02 crc kubenswrapper[4651]: I1126 14:51:02.020786 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:02 crc kubenswrapper[4651]: I1126 14:51:02.020932 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:51:02 crc kubenswrapper[4651]: I1126 14:51:02.020975 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:51:02 crc kubenswrapper[4651]: I1126 14:51:02.021003 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:51:02 crc kubenswrapper[4651]: I1126 14:51:02.021083 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.021220 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.021239 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.021253 4651 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.021306 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:10.02128973 +0000 UTC m=+37.447037344 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.021642 4651 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.021682 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.021702 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.021716 4651 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.021751 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:10.021727651 +0000 UTC m=+37.447475305 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.021774 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:10.021766192 +0000 UTC m=+37.447513926 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.021867 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:10.021858994 +0000 UTC m=+37.447606598 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.022089 4651 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.022297 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:10.022262565 +0000 UTC m=+37.448010379 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 26 14:51:02 crc kubenswrapper[4651]: I1126 14:51:02.524865 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs\") pod \"network-metrics-daemon-79fzh\" (UID: \"46f059e4-ddf4-4e21-b528-0cc9cec8afa1\") " pod="openshift-multus/network-metrics-daemon-79fzh"
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.525136 4651 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 26 14:51:02 crc kubenswrapper[4651]: E1126 14:51:02.525296 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs podName:46f059e4-ddf4-4e21-b528-0cc9cec8afa1 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:06.525276463 +0000 UTC m=+33.951024087 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs") pod "network-metrics-daemon-79fzh" (UID: "46f059e4-ddf4-4e21-b528-0cc9cec8afa1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 26 14:51:02 crc kubenswrapper[4651]: I1126 14:51:02.939218 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerStarted","Data":"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549"}
Nov 26 14:51:03 crc kubenswrapper[4651]: I1126 14:51:03.402682 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 26 14:51:03 crc kubenswrapper[4651]: I1126 14:51:03.402869 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 26 14:51:03 crc kubenswrapper[4651]: I1126 14:51:03.404652 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79fzh"
Nov 26 14:51:03 crc kubenswrapper[4651]: E1126 14:51:03.404648 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 26 14:51:03 crc kubenswrapper[4651]: I1126 14:51:03.404675 4651 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 26 14:51:03 crc kubenswrapper[4651]: E1126 14:51:03.404818 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 26 14:51:03 crc kubenswrapper[4651]: E1126 14:51:03.404930 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79fzh" podUID="46f059e4-ddf4-4e21-b528-0cc9cec8afa1"
Nov 26 14:51:03 crc kubenswrapper[4651]: E1126 14:51:03.404987 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 26 14:51:03 crc kubenswrapper[4651]: I1126 14:51:03.944707 4651 generic.go:334] "Generic (PLEG): container finished" podID="1d7e0f33-2db2-430b-8991-9beb97979488" containerID="e32e8f73a218f270709a1a67a58ada60109b10a4f73b80411f68326e37856621" exitCode=0
Nov 26 14:51:03 crc kubenswrapper[4651]: I1126 14:51:03.944750 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qfclp" event={"ID":"1d7e0f33-2db2-430b-8991-9beb97979488","Type":"ContainerDied","Data":"e32e8f73a218f270709a1a67a58ada60109b10a4f73b80411f68326e37856621"}
Nov 26 14:51:04 crc kubenswrapper[4651]: I1126 14:51:04.950662 4651 generic.go:334] "Generic (PLEG): container finished" podID="1d7e0f33-2db2-430b-8991-9beb97979488" containerID="27dc8325be14e88e3120e4e2667ae89c5fb04640d9a800f9558076be8091c6aa" exitCode=0
Nov 26 14:51:04 crc kubenswrapper[4651]: I1126 14:51:04.950969 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qfclp" event={"ID":"1d7e0f33-2db2-430b-8991-9beb97979488","Type":"ContainerDied","Data":"27dc8325be14e88e3120e4e2667ae89c5fb04640d9a800f9558076be8091c6aa"}
Nov 26 14:51:05 crc kubenswrapper[4651]: I1126 14:51:05.401379 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79fzh"
Nov 26 14:51:05 crc kubenswrapper[4651]: I1126 14:51:05.401434 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 26 14:51:05 crc kubenswrapper[4651]: E1126 14:51:05.401522 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79fzh" podUID="46f059e4-ddf4-4e21-b528-0cc9cec8afa1"
Nov 26 14:51:05 crc kubenswrapper[4651]: I1126 14:51:05.401563 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 26 14:51:05 crc kubenswrapper[4651]: E1126 14:51:05.401606 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 26 14:51:05 crc kubenswrapper[4651]: I1126 14:51:05.401559 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 26 14:51:05 crc kubenswrapper[4651]: E1126 14:51:05.401732 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 26 14:51:05 crc kubenswrapper[4651]: E1126 14:51:05.401768 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 26 14:51:05 crc kubenswrapper[4651]: I1126 14:51:05.958258 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerStarted","Data":"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9"}
Nov 26 14:51:05 crc kubenswrapper[4651]: I1126 14:51:05.959514 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh"
Nov 26 14:51:05 crc kubenswrapper[4651]: I1126 14:51:05.959641 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh"
Nov 26 14:51:05 crc kubenswrapper[4651]: I1126 14:51:05.962507 4651 generic.go:334] "Generic (PLEG): container finished" podID="1d7e0f33-2db2-430b-8991-9beb97979488" containerID="2df8cb81f3ab6763feeb81871d68b3130506db38bd419d6ef201e3e0606a1be8" exitCode=0
Nov 26 14:51:05 crc kubenswrapper[4651]: I1126 14:51:05.962642 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qfclp" event={"ID":"1d7e0f33-2db2-430b-8991-9beb97979488","Type":"ContainerDied","Data":"2df8cb81f3ab6763feeb81871d68b3130506db38bd419d6ef201e3e0606a1be8"}
Nov 26 14:51:06 crc kubenswrapper[4651]: I1126 14:51:06.000734 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh"
Nov 26 14:51:06 crc kubenswrapper[4651]: I1126 14:51:06.001167 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh"
Nov 26 14:51:06 crc kubenswrapper[4651]: I1126 14:51:06.007132 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" podStartSLOduration=8.007109688 podStartE2EDuration="8.007109688s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:06.00566119 +0000 UTC m=+33.431408814" watchObservedRunningTime="2025-11-26 14:51:06.007109688 +0000 UTC m=+33.432857342"
Nov 26 14:51:06 crc kubenswrapper[4651]: I1126 14:51:06.532261 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 14:51:06 crc kubenswrapper[4651]: I1126 14:51:06.560299 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs\") pod \"network-metrics-daemon-79fzh\" (UID: \"46f059e4-ddf4-4e21-b528-0cc9cec8afa1\") " pod="openshift-multus/network-metrics-daemon-79fzh"
Nov 26 14:51:06 crc kubenswrapper[4651]: E1126 14:51:06.560434 4651 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 26 14:51:06 crc kubenswrapper[4651]: E1126 14:51:06.560520 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs podName:46f059e4-ddf4-4e21-b528-0cc9cec8afa1 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:14.560501197 +0000 UTC m=+41.986248801 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs") pod "network-metrics-daemon-79fzh" (UID: "46f059e4-ddf4-4e21-b528-0cc9cec8afa1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 26 14:51:06 crc kubenswrapper[4651]: I1126 14:51:06.980975 4651 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 26 14:51:06 crc kubenswrapper[4651]: I1126 14:51:06.980980 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qfclp" event={"ID":"1d7e0f33-2db2-430b-8991-9beb97979488","Type":"ContainerStarted","Data":"c3d16428bdc2e6cddac145d8473762182bd73f346369f27c26b737c172f756d3"}
Nov 26 14:51:07 crc kubenswrapper[4651]: I1126 14:51:07.361080 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-79fzh"]
Nov 26 14:51:07 crc kubenswrapper[4651]: I1126 14:51:07.361195 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79fzh"
Nov 26 14:51:07 crc kubenswrapper[4651]: E1126 14:51:07.361266 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79fzh" podUID="46f059e4-ddf4-4e21-b528-0cc9cec8afa1"
Nov 26 14:51:07 crc kubenswrapper[4651]: I1126 14:51:07.402202 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 26 14:51:07 crc kubenswrapper[4651]: I1126 14:51:07.402257 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 26 14:51:07 crc kubenswrapper[4651]: E1126 14:51:07.402320 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 26 14:51:07 crc kubenswrapper[4651]: E1126 14:51:07.402386 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 26 14:51:07 crc kubenswrapper[4651]: I1126 14:51:07.402198 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 26 14:51:07 crc kubenswrapper[4651]: E1126 14:51:07.402459 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 26 14:51:07 crc kubenswrapper[4651]: I1126 14:51:07.986402 4651 generic.go:334] "Generic (PLEG): container finished" podID="1d7e0f33-2db2-430b-8991-9beb97979488" containerID="c3d16428bdc2e6cddac145d8473762182bd73f346369f27c26b737c172f756d3" exitCode=0
Nov 26 14:51:07 crc kubenswrapper[4651]: I1126 14:51:07.986476 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qfclp" event={"ID":"1d7e0f33-2db2-430b-8991-9beb97979488","Type":"ContainerDied","Data":"c3d16428bdc2e6cddac145d8473762182bd73f346369f27c26b737c172f756d3"}
Nov 26 14:51:07 crc kubenswrapper[4651]: I1126 14:51:07.986552 4651 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 26 14:51:08 crc kubenswrapper[4651]: I1126 14:51:08.993096 4651 generic.go:334] "Generic (PLEG): container finished" podID="1d7e0f33-2db2-430b-8991-9beb97979488" containerID="53cb0e8eaa83dc216c30d01111cd4702b740052bac1864a61297f9f3f518466d" exitCode=0
Nov 26 14:51:08 crc kubenswrapper[4651]: I1126 14:51:08.993154 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qfclp" event={"ID":"1d7e0f33-2db2-430b-8991-9beb97979488","Type":"ContainerDied","Data":"53cb0e8eaa83dc216c30d01111cd4702b740052bac1864a61297f9f3f518466d"}
Nov 26 14:51:09 crc kubenswrapper[4651]: I1126 14:51:09.401067 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 26 14:51:09 crc kubenswrapper[4651]: I1126 14:51:09.401071 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79fzh"
Nov 26 14:51:09 crc kubenswrapper[4651]: I1126 14:51:09.401111 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 26 14:51:09 crc kubenswrapper[4651]: E1126 14:51:09.401624 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79fzh" podUID="46f059e4-ddf4-4e21-b528-0cc9cec8afa1"
Nov 26 14:51:09 crc kubenswrapper[4651]: E1126 14:51:09.401484 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 26 14:51:09 crc kubenswrapper[4651]: E1126 14:51:09.401730 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 26 14:51:09 crc kubenswrapper[4651]: I1126 14:51:09.401149 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 26 14:51:09 crc kubenswrapper[4651]: E1126 14:51:09.401933 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.000816 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qfclp" event={"ID":"1d7e0f33-2db2-430b-8991-9beb97979488","Type":"ContainerStarted","Data":"df3781ebe4e8e0094afdd5e9b56de486bf03f9bcd6a63c16ac128487cce2e049"}
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.026871 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qfclp" podStartSLOduration=12.02685282 podStartE2EDuration="12.02685282s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:10.023432831 +0000 UTC m=+37.449180475" watchObservedRunningTime="2025-11-26 14:51:10.02685282 +0000 UTC m=+37.452600434"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.094576 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.094732 4651 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:26.094672524 +0000 UTC m=+53.520420138 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.094768 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.094828 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.094916 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.094941 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.094948 4651 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.095014 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:26.094996933 +0000 UTC m=+53.520744547 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.095090 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.095118 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.095134 4651 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.095178 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:26.095163397 +0000 UTC m=+53.520911011 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.095253 4651 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.095260 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.095282 4651 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.095297 4651 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.095307 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:26.09529399 +0000 UTC m=+53.521041604 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 26 14:51:10 crc kubenswrapper[4651]: E1126 14:51:10.095341 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 14:51:26.095326251 +0000 UTC m=+53.521073865 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.680160 4651 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.680305 4651 kubelet_node_status.go:538] "Fast updating node status as it just became ready"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.718300 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.718960 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.723186 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mgxls"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.723808 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.724067 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pm87n"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.724331 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.725548 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.725818 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.727425 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9h5h8"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.727744 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.740029 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.757215 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.758018 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.758259 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.759475 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.759590 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.759792 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.759918 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.760021 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.760224 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Nov 26 14:51:10 crc
kubenswrapper[4651]: I1126 14:51:10.760384 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.760954 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.761130 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.761228 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.761312 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.761394 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.761485 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.761574 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.761670 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.761758 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.761984 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 26 
14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.762125 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.762231 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.762330 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.762441 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.772729 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.772747 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kps8x"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.772774 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.772929 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.773060 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.773083 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.773099 4651 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-fzwc6"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.773154 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.773186 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.773204 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.773698 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-v9zwm"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.773780 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.773817 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.773980 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-v9zwm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.775292 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.775422 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6jcgh"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.775752 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.775857 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.776104 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.776113 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.776493 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.777009 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-q4qzb"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.777295 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.777974 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.778645 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.778742 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.778916 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.781319 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-skjk9"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.781865 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.782284 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.782934 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.784348 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.784815 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.807721 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.808343 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.825715 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.825966 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.826374 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.826719 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.826859 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.826993 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 26 14:51:10 crc 
kubenswrapper[4651]: I1126 14:51:10.827115 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.826965 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e69d02a9-477f-4281-bb15-469b21b21f7a-serving-cert\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827227 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7efeda68-504a-457c-8576-15a4eb8ffc86-etcd-service-ca\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827267 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7efeda68-504a-457c-8576-15a4eb8ffc86-etcd-ca\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827292 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-trusted-ca-bundle\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827306 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" 
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827319 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9g4l\" (UniqueName: \"kubernetes.io/projected/e69d02a9-477f-4281-bb15-469b21b21f7a-kube-api-access-w9g4l\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827376 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15df1010-c6ea-4bca-9a97-e6659866310f-serving-cert\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827399 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-config\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827419 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56fe610b-235d-4252-9199-24c83fb3f457-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xc7c6\" (UID: \"56fe610b-235d-4252-9199-24c83fb3f457\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827477 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e7c2a5af-4204-4822-bec4-8589813d80df-etcd-client\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827500 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50e8580-b755-4535-9675-a167c40b6278-config\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827531 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-dir\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827552 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b50e8580-b755-4535-9675-a167c40b6278-serving-cert\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827576 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827573 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea68f924-76e2-4a91-82b7-90a3b194c011-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sxkh9\" (UID: \"ea68f924-76e2-4a91-82b7-90a3b194c011\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827701 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e7c2a5af-4204-4822-bec4-8589813d80df-audit-dir\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827719 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827723 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3416d607-a1be-4dda-9d40-8cd6276002bd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827746 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c2a5af-4204-4822-bec4-8589813d80df-serving-cert\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827771 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc5tb\" (UniqueName: \"kubernetes.io/projected/9010f7b8-93e2-47e6-ab50-16ca7a9b337d-kube-api-access-cc5tb\") pod \"downloads-7954f5f757-v9zwm\" (UID: \"9010f7b8-93e2-47e6-ab50-16ca7a9b337d\") " pod="openshift-console/downloads-7954f5f757-v9zwm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827791 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b50e8580-b755-4535-9675-a167c40b6278-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827827 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7efeda68-504a-457c-8576-15a4eb8ffc86-etcd-client\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827847 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh7cn\" (UniqueName: \"kubernetes.io/projected/4b814105-58ac-41b6-8b52-efa5de815233-kube-api-access-sh7cn\") pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827862 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827871 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e69d02a9-477f-4281-bb15-469b21b21f7a-node-pullsecrets\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827928 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827946 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827951 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e69d02a9-477f-4281-bb15-469b21b21f7a-encryption-config\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827971 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e69d02a9-477f-4281-bb15-469b21b21f7a-audit-dir\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.827992 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbrm7\" (UniqueName: \"kubernetes.io/projected/3416d607-a1be-4dda-9d40-8cd6276002bd-kube-api-access-tbrm7\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.828015 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-client-ca\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.828588 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.829229 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.829321 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.829368 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.829527 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.829589 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.829681 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.829728 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.830775 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.830840 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.830990 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831016 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831141 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831140 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831220 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831303 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efeda68-504a-457c-8576-15a4eb8ffc86-serving-cert\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831318 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831340 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e7c2a5af-4204-4822-bec4-8589813d80df-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831369 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831382 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831407 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-etcd-serving-ca\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831427 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-service-ca\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831449 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/39aef3e6-7314-4d82-8e9c-a83d505e022e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kps8x\" (UID: \"39aef3e6-7314-4d82-8e9c-a83d505e022e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831472 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3416d607-a1be-4dda-9d40-8cd6276002bd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831493 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpr2f\" (UniqueName: \"kubernetes.io/projected/74cd140b-bb74-4152-bb6f-0a42f92c864e-kube-api-access-hpr2f\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831513 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-policies\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831533 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85btv\" (UniqueName: \"kubernetes.io/projected/916a34e5-fa74-4e59-9deb-18a4067f007b-kube-api-access-85btv\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831553 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3416d607-a1be-4dda-9d40-8cd6276002bd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831579 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831601 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831621 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831641 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4b814105-58ac-41b6-8b52-efa5de815233-images\") pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831661 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-config\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831682 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e69d02a9-477f-4281-bb15-469b21b21f7a-etcd-client\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831700 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831721 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-rdmv8\" (UniqueName: \"kubernetes.io/projected/ea68f924-76e2-4a91-82b7-90a3b194c011-kube-api-access-rdmv8\") pod \"openshift-controller-manager-operator-756b6f6bc6-sxkh9\" (UID: \"ea68f924-76e2-4a91-82b7-90a3b194c011\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831740 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-image-import-ca\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831761 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-serving-cert\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831777 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831782 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqcbc\" (UniqueName: \"kubernetes.io/projected/39aef3e6-7314-4d82-8e9c-a83d505e022e-kube-api-access-rqcbc\") pod \"openshift-config-operator-7777fb866f-kps8x\" (UID: \"39aef3e6-7314-4d82-8e9c-a83d505e022e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831807 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831829 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea68f924-76e2-4a91-82b7-90a3b194c011-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sxkh9\" (UID: \"ea68f924-76e2-4a91-82b7-90a3b194c011\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831849 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz62r\" (UniqueName: \"kubernetes.io/projected/56fe610b-235d-4252-9199-24c83fb3f457-kube-api-access-hz62r\") pod \"openshift-apiserver-operator-796bbdcf4f-xc7c6\" (UID: \"56fe610b-235d-4252-9199-24c83fb3f457\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831868 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-oauth-config\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831900 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e7c2a5af-4204-4822-bec4-8589813d80df-encryption-config\") pod \"apiserver-7bbb656c7d-j7sjm\" 
(UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831946 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831970 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39aef3e6-7314-4d82-8e9c-a83d505e022e-serving-cert\") pod \"openshift-config-operator-7777fb866f-kps8x\" (UID: \"39aef3e6-7314-4d82-8e9c-a83d505e022e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.831991 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-config\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832015 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916a34e5-fa74-4e59-9deb-18a4067f007b-serving-cert\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832049 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-audit\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832105 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56fe610b-235d-4252-9199-24c83fb3f457-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xc7c6\" (UID: \"56fe610b-235d-4252-9199-24c83fb3f457\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832128 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfrf\" (UniqueName: \"kubernetes.io/projected/e7c2a5af-4204-4822-bec4-8589813d80df-kube-api-access-wlfrf\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832150 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-oauth-serving-cert\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832170 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e7c2a5af-4204-4822-bec4-8589813d80df-audit-policies\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc 
kubenswrapper[4651]: I1126 14:51:10.832191 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832210 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7c2a5af-4204-4822-bec4-8589813d80df-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832231 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64lj6\" (UniqueName: \"kubernetes.io/projected/15df1010-c6ea-4bca-9a97-e6659866310f-kube-api-access-64lj6\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832256 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b50e8580-b755-4535-9675-a167c40b6278-service-ca-bundle\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832278 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832299 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832318 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b814105-58ac-41b6-8b52-efa5de815233-config\") pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832337 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b814105-58ac-41b6-8b52-efa5de815233-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832357 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-config\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " 
pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832376 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsbp9\" (UniqueName: \"kubernetes.io/projected/b50e8580-b755-4535-9675-a167c40b6278-kube-api-access-dsbp9\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832404 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832441 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhlzz\" (UniqueName: \"kubernetes.io/projected/7efeda68-504a-457c-8576-15a4eb8ffc86-kube-api-access-qhlzz\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832464 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-client-ca\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832486 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfd8c\" (UniqueName: \"kubernetes.io/projected/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-kube-api-access-sfd8c\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832505 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.832523 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efeda68-504a-457c-8576-15a4eb8ffc86-config\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.833299 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.833431 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.833568 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.833690 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.833822 4651 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.833905 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.833961 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.834021 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.834093 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.834161 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.834199 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.834274 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.834302 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.834448 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.834642 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: 
I1126 14:51:10.834730 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.834862 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.835023 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.835127 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.835849 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.836067 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.836221 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.836246 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c6kjm"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.836824 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.836837 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.836963 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.837110 4651 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.837343 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.837570 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bb2l7"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.837611 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.837730 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.838146 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.838174 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.839198 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.839315 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.839718 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.841708 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.848548 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.849167 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.849660 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.849987 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.850779 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-vw9bz"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.851181 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.854915 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.855428 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.892612 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.894187 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.900321 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.900516 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.901087 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.901744 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.902233 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.903152 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pp9mp"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.924070 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.926718 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.926972 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.927936 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.928422 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.929054 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.929224 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.929397 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.929623 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.933894 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.933957 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e69d02a9-477f-4281-bb15-469b21b21f7a-etcd-client\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.933993 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-config\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934021 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bb6d2ae9-7867-4995-97b5-33740c0de594-machine-approver-tls\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934072 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c80d47-92ec-4861-8936-289e6525a876-config\") pod \"kube-apiserver-operator-766d6c64bb-bl9fp\" (UID: \"d6c80d47-92ec-4861-8936-289e6525a876\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" Nov 
26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934100 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqcbc\" (UniqueName: \"kubernetes.io/projected/39aef3e6-7314-4d82-8e9c-a83d505e022e-kube-api-access-rqcbc\") pod \"openshift-config-operator-7777fb866f-kps8x\" (UID: \"39aef3e6-7314-4d82-8e9c-a83d505e022e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934122 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz9v4\" (UniqueName: \"kubernetes.io/projected/bb6d2ae9-7867-4995-97b5-33740c0de594-kube-api-access-rz9v4\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934147 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934171 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdmv8\" (UniqueName: \"kubernetes.io/projected/ea68f924-76e2-4a91-82b7-90a3b194c011-kube-api-access-rdmv8\") pod \"openshift-controller-manager-operator-756b6f6bc6-sxkh9\" (UID: \"ea68f924-76e2-4a91-82b7-90a3b194c011\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934193 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-image-import-ca\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934213 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-serving-cert\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934234 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz62r\" (UniqueName: \"kubernetes.io/projected/56fe610b-235d-4252-9199-24c83fb3f457-kube-api-access-hz62r\") pod \"openshift-apiserver-operator-796bbdcf4f-xc7c6\" (UID: \"56fe610b-235d-4252-9199-24c83fb3f457\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934254 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-oauth-config\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934290 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 
14:51:10.934314 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea68f924-76e2-4a91-82b7-90a3b194c011-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sxkh9\" (UID: \"ea68f924-76e2-4a91-82b7-90a3b194c011\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934334 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e7c2a5af-4204-4822-bec4-8589813d80df-encryption-config\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934355 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97f4b33d-62ab-442b-a11a-2c62f88c3b80-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934377 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39aef3e6-7314-4d82-8e9c-a83d505e022e-serving-cert\") pod \"openshift-config-operator-7777fb866f-kps8x\" (UID: \"39aef3e6-7314-4d82-8e9c-a83d505e022e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934398 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d53b7ba5-49be-4aa3-87d6-89c74221cfda-metrics-tls\") pod \"dns-operator-744455d44c-c6kjm\" (UID: 
\"d53b7ba5-49be-4aa3-87d6-89c74221cfda\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934422 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934455 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-config\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934478 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws65s\" (UniqueName: \"kubernetes.io/projected/7856b53a-287e-4c39-9f3f-0f384ecc84fe-kube-api-access-ws65s\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934502 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4kx\" (UniqueName: \"kubernetes.io/projected/9c5376be-3ddd-4168-aed7-8ea2bc1fc97e-kube-api-access-xx4kx\") pod \"multus-admission-controller-857f4d67dd-pp9mp\" (UID: \"9c5376be-3ddd-4168-aed7-8ea2bc1fc97e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934528 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/916a34e5-fa74-4e59-9deb-18a4067f007b-serving-cert\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934550 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-audit\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934572 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56fe610b-235d-4252-9199-24c83fb3f457-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xc7c6\" (UID: \"56fe610b-235d-4252-9199-24c83fb3f457\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934594 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934606 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-oauth-serving-cert\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934627 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e7c2a5af-4204-4822-bec4-8589813d80df-audit-policies\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934661 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfrf\" (UniqueName: \"kubernetes.io/projected/e7c2a5af-4204-4822-bec4-8589813d80df-kube-api-access-wlfrf\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934684 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7856b53a-287e-4c39-9f3f-0f384ecc84fe-default-certificate\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934710 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934734 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7c2a5af-4204-4822-bec4-8589813d80df-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934756 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64lj6\" (UniqueName: \"kubernetes.io/projected/15df1010-c6ea-4bca-9a97-e6659866310f-kube-api-access-64lj6\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934780 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4clff\" (UniqueName: \"kubernetes.io/projected/fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76-kube-api-access-4clff\") pod \"cluster-samples-operator-665b6dd947-nhmkw\" (UID: \"fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934806 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b50e8580-b755-4535-9675-a167c40b6278-service-ca-bundle\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" 
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934830 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-config\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934851 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsbp9\" (UniqueName: \"kubernetes.io/projected/b50e8580-b755-4535-9675-a167c40b6278-kube-api-access-dsbp9\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934876 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934899 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934921 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b814105-58ac-41b6-8b52-efa5de815233-config\") 
pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934943 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b814105-58ac-41b6-8b52-efa5de815233-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934967 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c80d47-92ec-4861-8936-289e6525a876-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bl9fp\" (UID: \"d6c80d47-92ec-4861-8936-289e6525a876\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.934989 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97f4b33d-62ab-442b-a11a-2c62f88c3b80-trusted-ca\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935009 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7856b53a-287e-4c39-9f3f-0f384ecc84fe-stats-auth\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935029 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9c5376be-3ddd-4168-aed7-8ea2bc1fc97e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pp9mp\" (UID: \"9c5376be-3ddd-4168-aed7-8ea2bc1fc97e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935074 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935114 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhlzz\" (UniqueName: \"kubernetes.io/projected/7efeda68-504a-457c-8576-15a4eb8ffc86-kube-api-access-qhlzz\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935138 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-client-ca\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935160 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfd8c\" (UniqueName: \"kubernetes.io/projected/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-kube-api-access-sfd8c\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935181 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935203 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efeda68-504a-457c-8576-15a4eb8ffc86-config\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935228 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e69d02a9-477f-4281-bb15-469b21b21f7a-serving-cert\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935251 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7efeda68-504a-457c-8576-15a4eb8ffc86-etcd-service-ca\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935285 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7856b53a-287e-4c39-9f3f-0f384ecc84fe-service-ca-bundle\") pod \"router-default-5444994796-vw9bz\" (UID: 
\"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935308 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6c80d47-92ec-4861-8936-289e6525a876-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bl9fp\" (UID: \"d6c80d47-92ec-4861-8936-289e6525a876\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935337 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7efeda68-504a-457c-8576-15a4eb8ffc86-etcd-ca\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935360 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-trusted-ca-bundle\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935384 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9g4l\" (UniqueName: \"kubernetes.io/projected/e69d02a9-477f-4281-bb15-469b21b21f7a-kube-api-access-w9g4l\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935407 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/97f4b33d-62ab-442b-a11a-2c62f88c3b80-metrics-tls\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935441 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15df1010-c6ea-4bca-9a97-e6659866310f-serving-cert\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935465 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-dir\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935488 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-config\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935509 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56fe610b-235d-4252-9199-24c83fb3f457-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xc7c6\" (UID: \"56fe610b-235d-4252-9199-24c83fb3f457\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935530 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e7c2a5af-4204-4822-bec4-8589813d80df-etcd-client\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935551 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50e8580-b755-4535-9675-a167c40b6278-config\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935556 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935568 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jgc22"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.936114 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.936832 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea68f924-76e2-4a91-82b7-90a3b194c011-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sxkh9\" (UID: \"ea68f924-76e2-4a91-82b7-90a3b194c011\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.935574 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea68f924-76e2-4a91-82b7-90a3b194c011-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sxkh9\" (UID: \"ea68f924-76e2-4a91-82b7-90a3b194c011\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937464 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b50e8580-b755-4535-9675-a167c40b6278-serving-cert\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937488 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e7c2a5af-4204-4822-bec4-8589813d80df-audit-dir\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937512 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/3416d607-a1be-4dda-9d40-8cd6276002bd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937535 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c2a5af-4204-4822-bec4-8589813d80df-serving-cert\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937568 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc5tb\" (UniqueName: \"kubernetes.io/projected/9010f7b8-93e2-47e6-ab50-16ca7a9b337d-kube-api-access-cc5tb\") pod \"downloads-7954f5f757-v9zwm\" (UID: \"9010f7b8-93e2-47e6-ab50-16ca7a9b337d\") " pod="openshift-console/downloads-7954f5f757-v9zwm" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937586 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b50e8580-b755-4535-9675-a167c40b6278-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937606 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7efeda68-504a-457c-8576-15a4eb8ffc86-etcd-client\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937627 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkqbw\" (UniqueName: \"kubernetes.io/projected/97f4b33d-62ab-442b-a11a-2c62f88c3b80-kube-api-access-hkqbw\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937647 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh7cn\" (UniqueName: \"kubernetes.io/projected/4b814105-58ac-41b6-8b52-efa5de815233-kube-api-access-sh7cn\") pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937669 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e69d02a9-477f-4281-bb15-469b21b21f7a-node-pullsecrets\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937675 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b50e8580-b755-4535-9675-a167c40b6278-service-ca-bundle\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937686 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvqh7\" (UniqueName: \"kubernetes.io/projected/d53b7ba5-49be-4aa3-87d6-89c74221cfda-kube-api-access-jvqh7\") pod \"dns-operator-744455d44c-c6kjm\" (UID: \"d53b7ba5-49be-4aa3-87d6-89c74221cfda\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937724 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c36becc-6886-4b68-8a62-9a857bd09359-proxy-tls\") pod \"machine-config-controller-84d6567774-96sw4\" (UID: \"5c36becc-6886-4b68-8a62-9a857bd09359\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937751 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nhmkw\" (UID: \"fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937779 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937807 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e69d02a9-477f-4281-bb15-469b21b21f7a-encryption-config\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937832 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e69d02a9-477f-4281-bb15-469b21b21f7a-audit-dir\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937855 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbrm7\" (UniqueName: \"kubernetes.io/projected/3416d607-a1be-4dda-9d40-8cd6276002bd-kube-api-access-tbrm7\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937879 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e7c2a5af-4204-4822-bec4-8589813d80df-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937903 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7856b53a-287e-4c39-9f3f-0f384ecc84fe-metrics-certs\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937928 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-client-ca\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937951 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efeda68-504a-457c-8576-15a4eb8ffc86-serving-cert\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.937973 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6d2ae9-7867-4995-97b5-33740c0de594-config\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938000 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ba8185d-5551-4919-88c7-8d98a1a955b6-config\") pod \"kube-controller-manager-operator-78b949d7b-n264l\" (UID: \"8ba8185d-5551-4919-88c7-8d98a1a955b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938024 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdd9n\" (UniqueName: \"kubernetes.io/projected/5c36becc-6886-4b68-8a62-9a857bd09359-kube-api-access-xdd9n\") pod \"machine-config-controller-84d6567774-96sw4\" (UID: \"5c36becc-6886-4b68-8a62-9a857bd09359\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938160 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ba8185d-5551-4919-88c7-8d98a1a955b6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-n264l\" (UID: \"8ba8185d-5551-4919-88c7-8d98a1a955b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938191 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938213 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-etcd-serving-ca\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938235 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-service-ca\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938256 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/39aef3e6-7314-4d82-8e9c-a83d505e022e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kps8x\" (UID: \"39aef3e6-7314-4d82-8e9c-a83d505e022e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938281 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpr2f\" (UniqueName: \"kubernetes.io/projected/74cd140b-bb74-4152-bb6f-0a42f92c864e-kube-api-access-hpr2f\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938303 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3416d607-a1be-4dda-9d40-8cd6276002bd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938327 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bb6d2ae9-7867-4995-97b5-33740c0de594-auth-proxy-config\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938350 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ba8185d-5551-4919-88c7-8d98a1a955b6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-n264l\" (UID: \"8ba8185d-5551-4919-88c7-8d98a1a955b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938377 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-policies\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938399 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85btv\" (UniqueName: \"kubernetes.io/projected/916a34e5-fa74-4e59-9deb-18a4067f007b-kube-api-access-85btv\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938424 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3416d607-a1be-4dda-9d40-8cd6276002bd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938448 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938472 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938511 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938536 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4b814105-58ac-41b6-8b52-efa5de815233-images\") pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.938559 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c36becc-6886-4b68-8a62-9a857bd09359-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-96sw4\" (UID: \"5c36becc-6886-4b68-8a62-9a857bd09359\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.939612 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-config\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.939830 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-config\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.939887 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.940630 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-audit\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.943097 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.943334 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e69d02a9-477f-4281-bb15-469b21b21f7a-etcd-client\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.943531 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.943858 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.944002 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b50e8580-b755-4535-9675-a167c40b6278-serving-cert\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.944067 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e7c2a5af-4204-4822-bec4-8589813d80df-audit-dir\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.944081 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.944476 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.944637 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.945155 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.945407 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-image-import-ca\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.945629 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.945764 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.946283 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c2a5af-4204-4822-bec4-8589813d80df-serving-cert\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.946838 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b814105-58ac-41b6-8b52-efa5de815233-config\") pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.947099 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b50e8580-b755-4535-9675-a167c40b6278-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.947785 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56fe610b-235d-4252-9199-24c83fb3f457-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xc7c6\" (UID: \"56fe610b-235d-4252-9199-24c83fb3f457\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.948532 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-oauth-serving-cert\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.948871 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.948910 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.950357 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.950431 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.950710 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.950811 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b814105-58ac-41b6-8b52-efa5de815233-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.951136 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e7c2a5af-4204-4822-bec4-8589813d80df-audit-policies\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.951151 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7c2a5af-4204-4822-bec4-8589813d80df-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.952377 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.952958 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4b814105-58ac-41b6-8b52-efa5de815233-images\") pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.955596 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916a34e5-fa74-4e59-9deb-18a4067f007b-serving-cert\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.955659 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39aef3e6-7314-4d82-8e9c-a83d505e022e-serving-cert\") pod \"openshift-config-operator-7777fb866f-kps8x\" (UID: \"39aef3e6-7314-4d82-8e9c-a83d505e022e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.956646 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.957011 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-c6sbq"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.957152 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56fe610b-235d-4252-9199-24c83fb3f457-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xc7c6\" (UID: \"56fe610b-235d-4252-9199-24c83fb3f457\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.957305 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.957616 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.958140 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-service-ca\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.958352 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.958526 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-c6sbq"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.960281 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e69d02a9-477f-4281-bb15-469b21b21f7a-encryption-config\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.961067 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3416d607-a1be-4dda-9d40-8cd6276002bd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.961474 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e7c2a5af-4204-4822-bec4-8589813d80df-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.961870 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-policies\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.962283 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pm87n"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.962314 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-b92mn"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.962398 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-client-ca\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.962822 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mgxls"]
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.962889 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-b92mn"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.963268 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/39aef3e6-7314-4d82-8e9c-a83d505e022e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kps8x\" (UID: \"39aef3e6-7314-4d82-8e9c-a83d505e022e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.963795 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e7c2a5af-4204-4822-bec4-8589813d80df-etcd-client\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.964885 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.965629 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e69d02a9-477f-4281-bb15-469b21b21f7a-audit-dir\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.966120 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7efeda68-504a-457c-8576-15a4eb8ffc86-etcd-ca\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.966377 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-client-ca\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.966900 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50e8580-b755-4535-9675-a167c40b6278-config\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.977172 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efeda68-504a-457c-8576-15a4eb8ffc86-serving-cert\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.977186 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-oauth-config\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.977463 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.977880 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-serving-cert\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.977901 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.978126 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.978291 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efeda68-504a-457c-8576-15a4eb8ffc86-config\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.978785 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-etcd-serving-ca\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.979330 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.979369 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e69d02a9-477f-4281-bb15-469b21b21f7a-node-pullsecrets\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.982409 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7efeda68-504a-457c-8576-15a4eb8ffc86-etcd-client\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n"
Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.982628 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openshift-authentication/oauth-openshift-558db77b4-9h5h8"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.982689 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2jn8n"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.983406 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-trusted-ca-bundle\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.983553 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.984874 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.985401 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7efeda68-504a-457c-8576-15a4eb8ffc86-etcd-service-ca\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.985522 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-dir\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.985999 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e69d02a9-477f-4281-bb15-469b21b21f7a-config\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.986153 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.986187 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-v9zwm"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.986995 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.987180 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.987370 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.988108 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.988374 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-config\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.989837 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6jcgh"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.991922 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c6kjm"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.991946 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pp9mp"] Nov 26 14:51:10 crc kubenswrapper[4651]: I1126 14:51:10.997731 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jgc22"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:10.999546 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.001252 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.002280 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.003237 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kps8x"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.005094 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.017629 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.017750 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15df1010-c6ea-4bca-9a97-e6659866310f-serving-cert\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.017806 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3416d607-a1be-4dda-9d40-8cd6276002bd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.018920 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e7c2a5af-4204-4822-bec4-8589813d80df-encryption-config\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" 
Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.020784 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.022318 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea68f924-76e2-4a91-82b7-90a3b194c011-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sxkh9\" (UID: \"ea68f924-76e2-4a91-82b7-90a3b194c011\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.024185 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.033426 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fzwc6"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.038567 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-skjk9"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039491 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkqbw\" (UniqueName: \"kubernetes.io/projected/97f4b33d-62ab-442b-a11a-2c62f88c3b80-kube-api-access-hkqbw\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039548 4651 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jvqh7\" (UniqueName: \"kubernetes.io/projected/d53b7ba5-49be-4aa3-87d6-89c74221cfda-kube-api-access-jvqh7\") pod \"dns-operator-744455d44c-c6kjm\" (UID: \"d53b7ba5-49be-4aa3-87d6-89c74221cfda\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039591 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c36becc-6886-4b68-8a62-9a857bd09359-proxy-tls\") pod \"machine-config-controller-84d6567774-96sw4\" (UID: \"5c36becc-6886-4b68-8a62-9a857bd09359\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039610 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nhmkw\" (UID: \"fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039631 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7856b53a-287e-4c39-9f3f-0f384ecc84fe-metrics-certs\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039650 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ba8185d-5551-4919-88c7-8d98a1a955b6-config\") pod \"kube-controller-manager-operator-78b949d7b-n264l\" (UID: \"8ba8185d-5551-4919-88c7-8d98a1a955b6\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039669 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6d2ae9-7867-4995-97b5-33740c0de594-config\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039690 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdd9n\" (UniqueName: \"kubernetes.io/projected/5c36becc-6886-4b68-8a62-9a857bd09359-kube-api-access-xdd9n\") pod \"machine-config-controller-84d6567774-96sw4\" (UID: \"5c36becc-6886-4b68-8a62-9a857bd09359\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039711 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ba8185d-5551-4919-88c7-8d98a1a955b6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-n264l\" (UID: \"8ba8185d-5551-4919-88c7-8d98a1a955b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039738 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bb6d2ae9-7867-4995-97b5-33740c0de594-auth-proxy-config\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039756 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ba8185d-5551-4919-88c7-8d98a1a955b6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-n264l\" (UID: \"8ba8185d-5551-4919-88c7-8d98a1a955b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039786 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c36becc-6886-4b68-8a62-9a857bd09359-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-96sw4\" (UID: \"5c36becc-6886-4b68-8a62-9a857bd09359\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039807 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bb6d2ae9-7867-4995-97b5-33740c0de594-machine-approver-tls\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039826 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c80d47-92ec-4861-8936-289e6525a876-config\") pod \"kube-apiserver-operator-766d6c64bb-bl9fp\" (UID: \"d6c80d47-92ec-4861-8936-289e6525a876\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039860 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz9v4\" (UniqueName: \"kubernetes.io/projected/bb6d2ae9-7867-4995-97b5-33740c0de594-kube-api-access-rz9v4\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039897 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97f4b33d-62ab-442b-a11a-2c62f88c3b80-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039918 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d53b7ba5-49be-4aa3-87d6-89c74221cfda-metrics-tls\") pod \"dns-operator-744455d44c-c6kjm\" (UID: \"d53b7ba5-49be-4aa3-87d6-89c74221cfda\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039939 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws65s\" (UniqueName: \"kubernetes.io/projected/7856b53a-287e-4c39-9f3f-0f384ecc84fe-kube-api-access-ws65s\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.039961 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4kx\" (UniqueName: \"kubernetes.io/projected/9c5376be-3ddd-4168-aed7-8ea2bc1fc97e-kube-api-access-xx4kx\") pod \"multus-admission-controller-857f4d67dd-pp9mp\" (UID: \"9c5376be-3ddd-4168-aed7-8ea2bc1fc97e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.040000 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/7856b53a-287e-4c39-9f3f-0f384ecc84fe-default-certificate\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.040027 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4clff\" (UniqueName: \"kubernetes.io/projected/fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76-kube-api-access-4clff\") pod \"cluster-samples-operator-665b6dd947-nhmkw\" (UID: \"fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.040077 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c80d47-92ec-4861-8936-289e6525a876-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bl9fp\" (UID: \"d6c80d47-92ec-4861-8936-289e6525a876\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.040095 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97f4b33d-62ab-442b-a11a-2c62f88c3b80-trusted-ca\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.040112 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7856b53a-287e-4c39-9f3f-0f384ecc84fe-stats-auth\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.040131 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9c5376be-3ddd-4168-aed7-8ea2bc1fc97e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pp9mp\" (UID: \"9c5376be-3ddd-4168-aed7-8ea2bc1fc97e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.040188 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7856b53a-287e-4c39-9f3f-0f384ecc84fe-service-ca-bundle\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.040208 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6c80d47-92ec-4861-8936-289e6525a876-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bl9fp\" (UID: \"d6c80d47-92ec-4861-8936-289e6525a876\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.040234 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97f4b33d-62ab-442b-a11a-2c62f88c3b80-metrics-tls\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.041360 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6d2ae9-7867-4995-97b5-33740c0de594-config\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.041915 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bb6d2ae9-7867-4995-97b5-33740c0de594-auth-proxy-config\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.042093 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e69d02a9-477f-4281-bb15-469b21b21f7a-serving-cert\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.042433 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.045275 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nhmkw\" (UID: \"fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.046891 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d53b7ba5-49be-4aa3-87d6-89c74221cfda-metrics-tls\") pod \"dns-operator-744455d44c-c6kjm\" (UID: \"d53b7ba5-49be-4aa3-87d6-89c74221cfda\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.047571 4651 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bb6d2ae9-7867-4995-97b5-33740c0de594-machine-approver-tls\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.048626 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c36becc-6886-4b68-8a62-9a857bd09359-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-96sw4\" (UID: \"5c36becc-6886-4b68-8a62-9a857bd09359\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.048669 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.050132 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.052958 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.053023 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.054167 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.055458 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.056399 
4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.057478 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q4qzb"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.058245 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-96wpb"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.059025 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-96wpb" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.059167 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bb2l7"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.060300 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-br4rv"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.061264 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.062791 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-b92mn"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.062821 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.062909 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.063173 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.064168 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.065327 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-c6sbq"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.067321 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jzbt6"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.067838 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.068727 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-nwdsz"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.069370 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nwdsz" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.069905 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.071418 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.072900 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2jn8n"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.073904 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.074806 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-96wpb"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.075691 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-br4rv"] Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.081698 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.091962 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ba8185d-5551-4919-88c7-8d98a1a955b6-config\") pod \"kube-controller-manager-operator-78b949d7b-n264l\" (UID: \"8ba8185d-5551-4919-88c7-8d98a1a955b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.100846 4651 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.104899 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ba8185d-5551-4919-88c7-8d98a1a955b6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-n264l\" (UID: \"8ba8185d-5551-4919-88c7-8d98a1a955b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.121399 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.141319 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.161644 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.180995 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.201297 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.221583 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.240911 4651 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.262670 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.281688 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.293054 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97f4b33d-62ab-442b-a11a-2c62f88c3b80-metrics-tls\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.307838 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.315856 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97f4b33d-62ab-442b-a11a-2c62f88c3b80-trusted-ca\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.321610 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.341562 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.356272 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7856b53a-287e-4c39-9f3f-0f384ecc84fe-metrics-certs\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.361548 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.381083 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.401323 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.401425 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.401841 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.402112 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.402122 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.416514 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7856b53a-287e-4c39-9f3f-0f384ecc84fe-default-certificate\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.422253 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7856b53a-287e-4c39-9f3f-0f384ecc84fe-stats-auth\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.422373 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.441814 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.461009 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.465483 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7856b53a-287e-4c39-9f3f-0f384ecc84fe-service-ca-bundle\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 
14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.481112 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.501253 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.520953 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.528880 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c80d47-92ec-4861-8936-289e6525a876-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bl9fp\" (UID: \"d6c80d47-92ec-4861-8936-289e6525a876\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.542377 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.543962 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c80d47-92ec-4861-8936-289e6525a876-config\") pod \"kube-apiserver-operator-766d6c64bb-bl9fp\" (UID: \"d6c80d47-92ec-4861-8936-289e6525a876\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.561299 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.573961 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/5c36becc-6886-4b68-8a62-9a857bd09359-proxy-tls\") pod \"machine-config-controller-84d6567774-96sw4\" (UID: \"5c36becc-6886-4b68-8a62-9a857bd09359\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.581274 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.641876 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.661306 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.668231 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9c5376be-3ddd-4168-aed7-8ea2bc1fc97e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pp9mp\" (UID: \"9c5376be-3ddd-4168-aed7-8ea2bc1fc97e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.681391 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.701865 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.721845 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.741505 4651 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.761613 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.781011 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.801437 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.821323 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.841149 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.860951 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.889109 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.901597 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.921111 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.940018 4651 request.go:700] Waited for 
1.003638195s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/secrets?fieldSelector=metadata.name%3Dmarketplace-operator-metrics&limit=500&resourceVersion=0 Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.942458 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.961116 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 26 14:51:11 crc kubenswrapper[4651]: I1126 14:51:11.988136 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.001473 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.034781 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64lj6\" (UniqueName: \"kubernetes.io/projected/15df1010-c6ea-4bca-9a97-e6659866310f-kube-api-access-64lj6\") pod \"route-controller-manager-6576b87f9c-b8wlj\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.054748 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsbp9\" (UniqueName: \"kubernetes.io/projected/b50e8580-b755-4535-9675-a167c40b6278-kube-api-access-dsbp9\") pod \"authentication-operator-69f744f599-6jcgh\" (UID: \"b50e8580-b755-4535-9675-a167c40b6278\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.074882 4651 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rqcbc\" (UniqueName: \"kubernetes.io/projected/39aef3e6-7314-4d82-8e9c-a83d505e022e-kube-api-access-rqcbc\") pod \"openshift-config-operator-7777fb866f-kps8x\" (UID: \"39aef3e6-7314-4d82-8e9c-a83d505e022e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.094343 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfrf\" (UniqueName: \"kubernetes.io/projected/e7c2a5af-4204-4822-bec4-8589813d80df-kube-api-access-wlfrf\") pod \"apiserver-7bbb656c7d-j7sjm\" (UID: \"e7c2a5af-4204-4822-bec4-8589813d80df\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.119151 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3416d607-a1be-4dda-9d40-8cd6276002bd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.134203 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdmv8\" (UniqueName: \"kubernetes.io/projected/ea68f924-76e2-4a91-82b7-90a3b194c011-kube-api-access-rdmv8\") pod \"openshift-controller-manager-operator-756b6f6bc6-sxkh9\" (UID: \"ea68f924-76e2-4a91-82b7-90a3b194c011\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.141456 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.141483 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.162294 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.183189 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.201439 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.221213 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.221617 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.243737 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.251769 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.261182 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.263571 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.301693 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.304407 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc5tb\" (UniqueName: \"kubernetes.io/projected/9010f7b8-93e2-47e6-ab50-16ca7a9b337d-kube-api-access-cc5tb\") pod \"downloads-7954f5f757-v9zwm\" (UID: \"9010f7b8-93e2-47e6-ab50-16ca7a9b337d\") " pod="openshift-console/downloads-7954f5f757-v9zwm" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.322741 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.358879 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbrm7\" (UniqueName: \"kubernetes.io/projected/3416d607-a1be-4dda-9d40-8cd6276002bd-kube-api-access-tbrm7\") pod \"cluster-image-registry-operator-dc59b4c8b-2gwmf\" (UID: \"3416d607-a1be-4dda-9d40-8cd6276002bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.360574 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.363675 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.382126 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.402318 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.422015 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.441600 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.460844 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.481853 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.499397 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6jcgh"] Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.500587 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 26 14:51:12 crc kubenswrapper[4651]: W1126 14:51:12.508900 4651 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb50e8580_b755_4535_9675_a167c40b6278.slice/crio-8a238e06efc97ff97266d119ae22089a29ddb0b758292c1ff487e7bf4b5b5d8d WatchSource:0}: Error finding container 8a238e06efc97ff97266d119ae22089a29ddb0b758292c1ff487e7bf4b5b5d8d: Status 404 returned error can't find the container with id 8a238e06efc97ff97266d119ae22089a29ddb0b758292c1ff487e7bf4b5b5d8d Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.524498 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.541310 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.544798 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-v9zwm" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.562843 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kps8x"] Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.570486 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 26 14:51:12 crc kubenswrapper[4651]: W1126 14:51:12.573247 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39aef3e6_7314_4d82_8e9c_a83d505e022e.slice/crio-fdd3b6c580259f6decaa68e775a55e40a67947774de22177893da9645863555e WatchSource:0}: Error finding container fdd3b6c580259f6decaa68e775a55e40a67947774de22177893da9645863555e: Status 404 returned error can't find the container with id fdd3b6c580259f6decaa68e775a55e40a67947774de22177893da9645863555e Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.575518 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.578310 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9"] Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.582287 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 26 14:51:12 crc kubenswrapper[4651]: W1126 14:51:12.585859 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea68f924_76e2_4a91_82b7_90a3b194c011.slice/crio-ae4cf72d1d9c0e1173473175ad2da41e89312a259f955a34e0bae719b62ea925 WatchSource:0}: Error finding container ae4cf72d1d9c0e1173473175ad2da41e89312a259f955a34e0bae719b62ea925: Status 404 returned error can't find the container with id ae4cf72d1d9c0e1173473175ad2da41e89312a259f955a34e0bae719b62ea925 Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.601437 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.659222 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpr2f\" (UniqueName: \"kubernetes.io/projected/74cd140b-bb74-4152-bb6f-0a42f92c864e-kube-api-access-hpr2f\") pod \"console-f9d7485db-q4qzb\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") " pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.661980 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz62r\" (UniqueName: \"kubernetes.io/projected/56fe610b-235d-4252-9199-24c83fb3f457-kube-api-access-hz62r\") pod \"openshift-apiserver-operator-796bbdcf4f-xc7c6\" (UID: \"56fe610b-235d-4252-9199-24c83fb3f457\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.662142 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.678865 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj"] Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.682682 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.694818 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm"] Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.709071 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.721821 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.762062 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfd8c\" (UniqueName: \"kubernetes.io/projected/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-kube-api-access-sfd8c\") pod \"oauth-openshift-558db77b4-9h5h8\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.780824 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh7cn\" (UniqueName: \"kubernetes.io/projected/4b814105-58ac-41b6-8b52-efa5de815233-kube-api-access-sh7cn\") pod \"machine-api-operator-5694c8668f-skjk9\" (UID: \"4b814105-58ac-41b6-8b52-efa5de815233\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.795280 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-v9zwm"] Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.798405 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhlzz\" (UniqueName: \"kubernetes.io/projected/7efeda68-504a-457c-8576-15a4eb8ffc86-kube-api-access-qhlzz\") pod \"etcd-operator-b45778765-pm87n\" (UID: \"7efeda68-504a-457c-8576-15a4eb8ffc86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.807308 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf"] Nov 26 14:51:12 crc kubenswrapper[4651]: W1126 14:51:12.819270 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9010f7b8_93e2_47e6_ab50_16ca7a9b337d.slice/crio-10e977ea7b0e44235bcb09d118b2a45ea87842e79739fd04f1bfc1e19007f13c WatchSource:0}: Error finding container 10e977ea7b0e44235bcb09d118b2a45ea87842e79739fd04f1bfc1e19007f13c: Status 404 returned error can't find the container with id 10e977ea7b0e44235bcb09d118b2a45ea87842e79739fd04f1bfc1e19007f13c Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.833719 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9g4l\" (UniqueName: \"kubernetes.io/projected/e69d02a9-477f-4281-bb15-469b21b21f7a-kube-api-access-w9g4l\") pod \"apiserver-76f77b778f-mgxls\" (UID: \"e69d02a9-477f-4281-bb15-469b21b21f7a\") " pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.840756 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85btv\" (UniqueName: 
\"kubernetes.io/projected/916a34e5-fa74-4e59-9deb-18a4067f007b-kube-api-access-85btv\") pod \"controller-manager-879f6c89f-fzwc6\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.841316 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.854086 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.862491 4651 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.868554 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.876668 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.881012 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.882126 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.928392 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkqbw\" (UniqueName: \"kubernetes.io/projected/97f4b33d-62ab-442b-a11a-2c62f88c3b80-kube-api-access-hkqbw\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.942688 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvqh7\" (UniqueName: \"kubernetes.io/projected/d53b7ba5-49be-4aa3-87d6-89c74221cfda-kube-api-access-jvqh7\") pod \"dns-operator-744455d44c-c6kjm\" (UID: \"d53b7ba5-49be-4aa3-87d6-89c74221cfda\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.962606 4651 request.go:700] Waited for 1.921130967s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/serviceaccounts/machine-config-controller/token Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.962978 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" Nov 26 14:51:12 crc kubenswrapper[4651]: I1126 14:51:12.976781 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97f4b33d-62ab-442b-a11a-2c62f88c3b80-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bh4kq\" (UID: \"97f4b33d-62ab-442b-a11a-2c62f88c3b80\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.002978 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdd9n\" (UniqueName: \"kubernetes.io/projected/5c36becc-6886-4b68-8a62-9a857bd09359-kube-api-access-xdd9n\") pod \"machine-config-controller-84d6567774-96sw4\" (UID: \"5c36becc-6886-4b68-8a62-9a857bd09359\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.011805 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ba8185d-5551-4919-88c7-8d98a1a955b6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-n264l\" (UID: \"8ba8185d-5551-4919-88c7-8d98a1a955b6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.028426 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz9v4\" (UniqueName: \"kubernetes.io/projected/bb6d2ae9-7867-4995-97b5-33740c0de594-kube-api-access-rz9v4\") pod \"machine-approver-56656f9798-rln7f\" (UID: \"bb6d2ae9-7867-4995-97b5-33740c0de594\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.028564 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.058488 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4kx\" (UniqueName: \"kubernetes.io/projected/9c5376be-3ddd-4168-aed7-8ea2bc1fc97e-kube-api-access-xx4kx\") pod \"multus-admission-controller-857f4d67dd-pp9mp\" (UID: \"9c5376be-3ddd-4168-aed7-8ea2bc1fc97e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.075205 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws65s\" (UniqueName: \"kubernetes.io/projected/7856b53a-287e-4c39-9f3f-0f384ecc84fe-kube-api-access-ws65s\") pod \"router-default-5444994796-vw9bz\" (UID: \"7856b53a-287e-4c39-9f3f-0f384ecc84fe\") " pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.077407 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4clff\" (UniqueName: \"kubernetes.io/projected/fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76-kube-api-access-4clff\") pod \"cluster-samples-operator-665b6dd947-nhmkw\" (UID: \"fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.077648 4651 generic.go:334] "Generic (PLEG): container finished" podID="39aef3e6-7314-4d82-8e9c-a83d505e022e" containerID="a2996fb16acf39cd296c4caf407d2b6560f68568181a3bdf513b8e6894fedb5a" exitCode=0 Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.077719 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" event={"ID":"39aef3e6-7314-4d82-8e9c-a83d505e022e","Type":"ContainerDied","Data":"a2996fb16acf39cd296c4caf407d2b6560f68568181a3bdf513b8e6894fedb5a"} Nov 26 14:51:13 crc 
kubenswrapper[4651]: I1126 14:51:13.077759 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" event={"ID":"39aef3e6-7314-4d82-8e9c-a83d505e022e","Type":"ContainerStarted","Data":"fdd3b6c580259f6decaa68e775a55e40a67947774de22177893da9645863555e"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.080240 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" event={"ID":"e7c2a5af-4204-4822-bec4-8589813d80df","Type":"ContainerStarted","Data":"fe3dcab4b610f120a22fefd6d0cee81e1bbaf8aaececc2bbe85b91c48747b264"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.082621 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-v9zwm" event={"ID":"9010f7b8-93e2-47e6-ab50-16ca7a9b337d","Type":"ContainerStarted","Data":"c12eeb6d9576ba6189752866196c175509b2b8f83c37c1114e932cf44d63975b"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.082670 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-v9zwm" event={"ID":"9010f7b8-93e2-47e6-ab50-16ca7a9b337d","Type":"ContainerStarted","Data":"10e977ea7b0e44235bcb09d118b2a45ea87842e79739fd04f1bfc1e19007f13c"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.082683 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-v9zwm" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.084961 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" event={"ID":"b50e8580-b755-4535-9675-a167c40b6278","Type":"ContainerStarted","Data":"f14de4551951ccb71d3e12388ee1d2b53dbae451113ec11c635ab0aef6ca75f8"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.085000 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" event={"ID":"b50e8580-b755-4535-9675-a167c40b6278","Type":"ContainerStarted","Data":"8a238e06efc97ff97266d119ae22089a29ddb0b758292c1ff487e7bf4b5b5d8d"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.086602 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" event={"ID":"15df1010-c6ea-4bca-9a97-e6659866310f","Type":"ContainerStarted","Data":"9d72f5f8abcf53d078b82752e80535c3b233eef917918beb42481570bbed7650"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.086629 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" event={"ID":"15df1010-c6ea-4bca-9a97-e6659866310f","Type":"ContainerStarted","Data":"e4f9cfcf620b05eebd7cbf3dd217fb0b84407b7060a74d448388dd7c1c0baa2e"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.088403 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.092813 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf" event={"ID":"3416d607-a1be-4dda-9d40-8cd6276002bd","Type":"ContainerStarted","Data":"1f0db91f1cd7cadb254a87b47847b65e59c9fa9999614a2188bc018265137bb7"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.092840 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf" event={"ID":"3416d607-a1be-4dda-9d40-8cd6276002bd","Type":"ContainerStarted","Data":"2d68f680879817e2b6cf63ff24c9c2e732b750c2e81406a4cb364db62dc86286"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.096892 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6c80d47-92ec-4861-8936-289e6525a876-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bl9fp\" (UID: \"d6c80d47-92ec-4861-8936-289e6525a876\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.101362 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.109072 4651 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-b8wlj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.109119 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" podUID="15df1010-c6ea-4bca-9a97-e6659866310f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.109418 4651 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9zwm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.109440 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v9zwm" podUID="9010f7b8-93e2-47e6-ab50-16ca7a9b337d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 26 14:51:13 crc 
kubenswrapper[4651]: I1126 14:51:13.119065 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" event={"ID":"ea68f924-76e2-4a91-82b7-90a3b194c011","Type":"ContainerStarted","Data":"799267c54792654f3abd5d3053138ab2148a31596a392469b7d00d38faad4bb0"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.119105 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" event={"ID":"ea68f924-76e2-4a91-82b7-90a3b194c011","Type":"ContainerStarted","Data":"ae4cf72d1d9c0e1173473175ad2da41e89312a259f955a34e0bae719b62ea925"} Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.121200 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.131235 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.144633 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.162858 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.183061 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.185984 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.189974 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.196941 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.202003 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.215288 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-skjk9"] Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.215408 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.221814 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.230148 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.235676 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.241813 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.242079 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.251737 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.265526 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.282417 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 26 14:51:13 crc kubenswrapper[4651]: W1126 14:51:13.293087 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b814105_58ac_41b6_8b52_efa5de815233.slice/crio-06ab480e824cb78ae9a38ebed73871a23fae40ecd844bcaa3b02705d20ae60c1 WatchSource:0}: Error finding container 06ab480e824cb78ae9a38ebed73871a23fae40ecd844bcaa3b02705d20ae60c1: Status 404 returned error can't find the container with id 06ab480e824cb78ae9a38ebed73871a23fae40ecd844bcaa3b02705d20ae60c1 Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.301358 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.315312 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.325155 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.346589 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.350155 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q4qzb"] Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.366348 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.382084 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 26 14:51:13 crc kubenswrapper[4651]: W1126 14:51:13.397376 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74cd140b_bb74_4152_bb6f_0a42f92c864e.slice/crio-b16eec6816a403c6deb6edf40af131b29cdec029d85256ce0b42d77bb49e5867 WatchSource:0}: Error finding container b16eec6816a403c6deb6edf40af131b29cdec029d85256ce0b42d77bb49e5867: Status 404 returned error can't find the container with id b16eec6816a403c6deb6edf40af131b29cdec029d85256ce0b42d77bb49e5867 Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.403640 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.422931 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.453004 4651 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9h5h8"] Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.482731 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-bound-sa-token\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.482775 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.482801 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-certificates\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.482819 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-trusted-ca\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.482852 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-tls\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.482890 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.482909 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.482924 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsdnt\" (UniqueName: \"kubernetes.io/projected/1bb12b15-c889-47bf-9e26-25196edb90e0-kube-api-access-qsdnt\") pod \"kube-storage-version-migrator-operator-b67b599dd-zl5lt\" (UID: \"1bb12b15-c889-47bf-9e26-25196edb90e0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.482940 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjk9n\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-kube-api-access-xjk9n\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: 
\"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.482955 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bb12b15-c889-47bf-9e26-25196edb90e0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zl5lt\" (UID: \"1bb12b15-c889-47bf-9e26-25196edb90e0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.482970 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bb12b15-c889-47bf-9e26-25196edb90e0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zl5lt\" (UID: \"1bb12b15-c889-47bf-9e26-25196edb90e0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" Nov 26 14:51:13 crc kubenswrapper[4651]: E1126 14:51:13.483767 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:13.983755396 +0000 UTC m=+41.409503000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:13 crc kubenswrapper[4651]: W1126 14:51:13.498139 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b1058c7_8ca9_41f7_b961_0b48e973c6c6.slice/crio-3e36e53bfe1f4916cd7c6b423c0d27dd0a4c9190a2a6150a9866ab43964b638d WatchSource:0}: Error finding container 3e36e53bfe1f4916cd7c6b423c0d27dd0a4c9190a2a6150a9866ab43964b638d: Status 404 returned error can't find the container with id 3e36e53bfe1f4916cd7c6b423c0d27dd0a4c9190a2a6150a9866ab43964b638d Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.509405 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fzwc6"] Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.552032 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw"] Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.571094 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mgxls"] Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.575209 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6"] Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584492 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584630 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584679 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsdnt\" (UniqueName: \"kubernetes.io/projected/1bb12b15-c889-47bf-9e26-25196edb90e0-kube-api-access-qsdnt\") pod \"kube-storage-version-migrator-operator-b67b599dd-zl5lt\" (UID: \"1bb12b15-c889-47bf-9e26-25196edb90e0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584728 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/990e931d-e194-4eb0-91dc-d2d7ea0f8d4e-signing-key\") pod \"service-ca-9c57cc56f-b92mn\" (UID: \"990e931d-e194-4eb0-91dc-d2d7ea0f8d4e\") " pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584774 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjk9n\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-kube-api-access-xjk9n\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584811 
4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7af18cbb-d68f-4d02-9556-62ea57ed250f-profile-collector-cert\") pod \"catalog-operator-68c6474976-rwlv8\" (UID: \"7af18cbb-d68f-4d02-9556-62ea57ed250f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584828 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6defa317-08ba-4208-8537-f7ed45bc26e9-ready\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584844 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-serving-cert\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584881 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bb12b15-c889-47bf-9e26-25196edb90e0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zl5lt\" (UID: \"1bb12b15-c889-47bf-9e26-25196edb90e0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584899 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bb12b15-c889-47bf-9e26-25196edb90e0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zl5lt\" 
(UID: \"1bb12b15-c889-47bf-9e26-25196edb90e0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584914 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50268c7a-5457-412a-8233-f8045815e7bf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xfr4\" (UID: \"50268c7a-5457-412a-8233-f8045815e7bf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584959 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50268c7a-5457-412a-8233-f8045815e7bf-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xfr4\" (UID: \"50268c7a-5457-412a-8233-f8045815e7bf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584978 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nsnr\" (UniqueName: \"kubernetes.io/projected/7af18cbb-d68f-4d02-9556-62ea57ed250f-kube-api-access-6nsnr\") pod \"catalog-operator-68c6474976-rwlv8\" (UID: \"7af18cbb-d68f-4d02-9556-62ea57ed250f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.584992 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6defa317-08ba-4208-8537-f7ed45bc26e9-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 
14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.585017 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e11728f-727f-4ed6-8c97-7d3383fb0db1-proxy-tls\") pod \"machine-config-operator-74547568cd-4wgpt\" (UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.585124 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e11728f-727f-4ed6-8c97-7d3383fb0db1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4wgpt\" (UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.585156 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g6hk\" (UniqueName: \"kubernetes.io/projected/df7a5934-8cbe-48de-badf-a0bf93119820-kube-api-access-8g6hk\") pod \"control-plane-machine-set-operator-78cbb6b69f-skz22\" (UID: \"df7a5934-8cbe-48de-badf-a0bf93119820\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.585221 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/841bb384-9633-40c9-883b-109faa7e681d-cert\") pod \"ingress-canary-96wpb\" (UID: \"841bb384-9633-40c9-883b-109faa7e681d\") " pod="openshift-ingress-canary/ingress-canary-96wpb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.585237 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cebd2b5f-d730-498e-a779-ad053c71f5ff-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7zf4d\" (UID: \"cebd2b5f-d730-498e-a779-ad053c71f5ff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.585351 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d7qh\" (UniqueName: \"kubernetes.io/projected/695d6bbc-9f78-4920-8186-a77d167378a9-kube-api-access-7d7qh\") pod \"collect-profiles-29402805-z6w5m\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.585389 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dncsd\" (UniqueName: \"kubernetes.io/projected/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-kube-api-access-dncsd\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.585457 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f8rg\" (UniqueName: \"kubernetes.io/projected/a52c4031-f252-41d9-9904-eb7e9f78d501-kube-api-access-2f8rg\") pod \"service-ca-operator-777779d784-tmtmr\" (UID: \"a52c4031-f252-41d9-9904-eb7e9f78d501\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.586263 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/df7a5934-8cbe-48de-badf-a0bf93119820-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-skz22\" (UID: \"df7a5934-8cbe-48de-badf-a0bf93119820\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.586288 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k22tk\" (UniqueName: \"kubernetes.io/projected/b1e02d51-3be7-4c15-9e50-f446bca05403-kube-api-access-k22tk\") pod \"marketplace-operator-79b997595-jgc22\" (UID: \"b1e02d51-3be7-4c15-9e50-f446bca05403\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.586305 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-mountpoint-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.586448 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/695d6bbc-9f78-4920-8186-a77d167378a9-secret-volume\") pod \"collect-profiles-29402805-z6w5m\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.586471 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7e11728f-727f-4ed6-8c97-7d3383fb0db1-images\") pod \"machine-config-operator-74547568cd-4wgpt\" (UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.586732 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-bound-sa-token\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.586797 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-config\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587084 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-registration-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587169 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jgc22\" (UID: \"b1e02d51-3be7-4c15-9e50-f446bca05403\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587191 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jgc22\" (UID: 
\"b1e02d51-3be7-4c15-9e50-f446bca05403\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587237 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-plugins-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587257 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50268c7a-5457-412a-8233-f8045815e7bf-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xfr4\" (UID: \"50268c7a-5457-412a-8233-f8045815e7bf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587271 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-trusted-ca\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587513 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0405d04c-84c3-4ea7-8efe-9216684f0f97-node-bootstrap-token\") pod \"machine-config-server-nwdsz\" (UID: \"0405d04c-84c3-4ea7-8efe-9216684f0f97\") " pod="openshift-machine-config-operator/machine-config-server-nwdsz" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587533 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-zcww8\" (UniqueName: \"kubernetes.io/projected/2d474cd7-8d0f-40f2-b125-94c074eee3c2-kube-api-access-zcww8\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587701 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgwtz\" (UniqueName: \"kubernetes.io/projected/cebd2b5f-d730-498e-a779-ad053c71f5ff-kube-api-access-zgwtz\") pod \"package-server-manager-789f6589d5-7zf4d\" (UID: \"cebd2b5f-d730-498e-a779-ad053c71f5ff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587721 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587847 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l597\" (UniqueName: \"kubernetes.io/projected/990e931d-e194-4eb0-91dc-d2d7ea0f8d4e-kube-api-access-6l597\") pod \"service-ca-9c57cc56f-b92mn\" (UID: \"990e931d-e194-4eb0-91dc-d2d7ea0f8d4e\") " pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.587873 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsgt6\" (UniqueName: \"kubernetes.io/projected/7e11728f-727f-4ed6-8c97-7d3383fb0db1-kube-api-access-qsgt6\") pod \"machine-config-operator-74547568cd-4wgpt\" (UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: E1126 14:51:13.588070 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:14.08805323 +0000 UTC m=+41.513800834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.588088 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skjbs\" (UniqueName: \"kubernetes.io/projected/59ae7330-c38e-426b-8781-4115ebed8c71-kube-api-access-skjbs\") pod \"dns-default-br4rv\" (UID: \"59ae7330-c38e-426b-8781-4115ebed8c71\") " pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.588110 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52c4031-f252-41d9-9904-eb7e9f78d501-config\") pod \"service-ca-operator-777779d784-tmtmr\" (UID: \"a52c4031-f252-41d9-9904-eb7e9f78d501\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.588346 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-certificates\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.588384 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-csi-data-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.588413 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59ae7330-c38e-426b-8781-4115ebed8c71-config-volume\") pod \"dns-default-br4rv\" (UID: \"59ae7330-c38e-426b-8781-4115ebed8c71\") " pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.588432 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/695d6bbc-9f78-4920-8186-a77d167378a9-config-volume\") pod \"collect-profiles-29402805-z6w5m\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.588737 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8dk8\" (UniqueName: \"kubernetes.io/projected/b19dd389-5b85-4864-b4d7-bcf3222b1061-kube-api-access-m8dk8\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: \"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.589901 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.589975 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-certificates\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.595756 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-trusted-ca\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.596596 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6defa317-08ba-4208-8537-f7ed45bc26e9-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.596635 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fswxq\" (UniqueName: \"kubernetes.io/projected/6defa317-08ba-4208-8537-f7ed45bc26e9-kube-api-access-fswxq\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.596670 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7af18cbb-d68f-4d02-9556-62ea57ed250f-srv-cert\") pod \"catalog-operator-68c6474976-rwlv8\" (UID: \"7af18cbb-d68f-4d02-9556-62ea57ed250f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.596716 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5kvx\" (UniqueName: \"kubernetes.io/projected/c9862b5b-24f3-41bc-aae6-36f911cf57a0-kube-api-access-m5kvx\") pod \"olm-operator-6b444d44fb-4xztb\" (UID: \"c9862b5b-24f3-41bc-aae6-36f911cf57a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.596770 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0405d04c-84c3-4ea7-8efe-9216684f0f97-certs\") pod \"machine-config-server-nwdsz\" (UID: \"0405d04c-84c3-4ea7-8efe-9216684f0f97\") " pod="openshift-machine-config-operator/machine-config-server-nwdsz" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.596787 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b19dd389-5b85-4864-b4d7-bcf3222b1061-tmpfs\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: \"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.596825 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-tls\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.596880 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a52c4031-f252-41d9-9904-eb7e9f78d501-serving-cert\") pod \"service-ca-operator-777779d784-tmtmr\" (UID: \"a52c4031-f252-41d9-9904-eb7e9f78d501\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.596907 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvghs\" (UniqueName: \"kubernetes.io/projected/0405d04c-84c3-4ea7-8efe-9216684f0f97-kube-api-access-hvghs\") pod \"machine-config-server-nwdsz\" (UID: \"0405d04c-84c3-4ea7-8efe-9216684f0f97\") " pod="openshift-machine-config-operator/machine-config-server-nwdsz" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.596924 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7cz6\" (UniqueName: \"kubernetes.io/projected/415988cf-7b69-44c8-a978-2ba440a2196b-kube-api-access-b7cz6\") pod \"migrator-59844c95c7-s4nz5\" (UID: \"415988cf-7b69-44c8-a978-2ba440a2196b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.596959 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/990e931d-e194-4eb0-91dc-d2d7ea0f8d4e-signing-cabundle\") pod \"service-ca-9c57cc56f-b92mn\" (UID: \"990e931d-e194-4eb0-91dc-d2d7ea0f8d4e\") " pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" Nov 26 14:51:13 crc 
kubenswrapper[4651]: I1126 14:51:13.596978 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-socket-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.597024 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b19dd389-5b85-4864-b4d7-bcf3222b1061-webhook-cert\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: \"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.597064 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59ae7330-c38e-426b-8781-4115ebed8c71-metrics-tls\") pod \"dns-default-br4rv\" (UID: \"59ae7330-c38e-426b-8781-4115ebed8c71\") " pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.597594 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c9862b5b-24f3-41bc-aae6-36f911cf57a0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4xztb\" (UID: \"c9862b5b-24f3-41bc-aae6-36f911cf57a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.597728 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b19dd389-5b85-4864-b4d7-bcf3222b1061-apiservice-cert\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: 
\"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.598502 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c9862b5b-24f3-41bc-aae6-36f911cf57a0-srv-cert\") pod \"olm-operator-6b444d44fb-4xztb\" (UID: \"c9862b5b-24f3-41bc-aae6-36f911cf57a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.598541 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.598585 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmlck\" (UniqueName: \"kubernetes.io/projected/841bb384-9633-40c9-883b-109faa7e681d-kube-api-access-rmlck\") pod \"ingress-canary-96wpb\" (UID: \"841bb384-9633-40c9-883b-109faa7e681d\") " pod="openshift-ingress-canary/ingress-canary-96wpb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.606160 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-trusted-ca\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: W1126 14:51:13.613439 4651 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7856b53a_287e_4c39_9f3f_0f384ecc84fe.slice/crio-8f9ae4efa9251d95eac4ffbe2b3cfbfe4d79b3506b2200f3e3bf95ec1a07b698 WatchSource:0}: Error finding container 8f9ae4efa9251d95eac4ffbe2b3cfbfe4d79b3506b2200f3e3bf95ec1a07b698: Status 404 returned error can't find the container with id 8f9ae4efa9251d95eac4ffbe2b3cfbfe4d79b3506b2200f3e3bf95ec1a07b698 Nov 26 14:51:13 crc kubenswrapper[4651]: E1126 14:51:13.618007 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:14.117107735 +0000 UTC m=+41.542855429 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.630648 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pm87n"] Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.639166 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bb12b15-c889-47bf-9e26-25196edb90e0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zl5lt\" (UID: \"1bb12b15-c889-47bf-9e26-25196edb90e0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.639925 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.680747 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjk9n\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-kube-api-access-xjk9n\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.681515 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-bound-sa-token\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.682100 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsdnt\" (UniqueName: \"kubernetes.io/projected/1bb12b15-c889-47bf-9e26-25196edb90e0-kube-api-access-qsdnt\") pod \"kube-storage-version-migrator-operator-b67b599dd-zl5lt\" (UID: \"1bb12b15-c889-47bf-9e26-25196edb90e0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.697946 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bb12b15-c889-47bf-9e26-25196edb90e0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zl5lt\" (UID: \"1bb12b15-c889-47bf-9e26-25196edb90e0\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702327 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702437 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7e11728f-727f-4ed6-8c97-7d3383fb0db1-images\") pod \"machine-config-operator-74547568cd-4wgpt\" (UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702471 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-config\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702497 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-registration-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702524 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jgc22\" (UID: \"b1e02d51-3be7-4c15-9e50-f446bca05403\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702540 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jgc22\" (UID: \"b1e02d51-3be7-4c15-9e50-f446bca05403\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702556 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-plugins-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702574 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50268c7a-5457-412a-8233-f8045815e7bf-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xfr4\" (UID: \"50268c7a-5457-412a-8233-f8045815e7bf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702594 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-trusted-ca\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702619 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0405d04c-84c3-4ea7-8efe-9216684f0f97-node-bootstrap-token\") pod \"machine-config-server-nwdsz\" (UID: \"0405d04c-84c3-4ea7-8efe-9216684f0f97\") " pod="openshift-machine-config-operator/machine-config-server-nwdsz" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702634 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcww8\" (UniqueName: \"kubernetes.io/projected/2d474cd7-8d0f-40f2-b125-94c074eee3c2-kube-api-access-zcww8\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702657 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgwtz\" (UniqueName: \"kubernetes.io/projected/cebd2b5f-d730-498e-a779-ad053c71f5ff-kube-api-access-zgwtz\") pod \"package-server-manager-789f6589d5-7zf4d\" (UID: \"cebd2b5f-d730-498e-a779-ad053c71f5ff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702680 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l597\" (UniqueName: \"kubernetes.io/projected/990e931d-e194-4eb0-91dc-d2d7ea0f8d4e-kube-api-access-6l597\") pod \"service-ca-9c57cc56f-b92mn\" (UID: \"990e931d-e194-4eb0-91dc-d2d7ea0f8d4e\") " pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702704 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsgt6\" (UniqueName: \"kubernetes.io/projected/7e11728f-727f-4ed6-8c97-7d3383fb0db1-kube-api-access-qsgt6\") pod \"machine-config-operator-74547568cd-4wgpt\" (UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702725 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skjbs\" (UniqueName: \"kubernetes.io/projected/59ae7330-c38e-426b-8781-4115ebed8c71-kube-api-access-skjbs\") pod \"dns-default-br4rv\" (UID: \"59ae7330-c38e-426b-8781-4115ebed8c71\") " pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702743 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52c4031-f252-41d9-9904-eb7e9f78d501-config\") pod \"service-ca-operator-777779d784-tmtmr\" (UID: \"a52c4031-f252-41d9-9904-eb7e9f78d501\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702767 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-csi-data-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702794 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59ae7330-c38e-426b-8781-4115ebed8c71-config-volume\") pod \"dns-default-br4rv\" (UID: \"59ae7330-c38e-426b-8781-4115ebed8c71\") " pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702818 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/695d6bbc-9f78-4920-8186-a77d167378a9-config-volume\") pod \"collect-profiles-29402805-z6w5m\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702841 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6defa317-08ba-4208-8537-f7ed45bc26e9-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702862 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8dk8\" (UniqueName: \"kubernetes.io/projected/b19dd389-5b85-4864-b4d7-bcf3222b1061-kube-api-access-m8dk8\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: \"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702885 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7af18cbb-d68f-4d02-9556-62ea57ed250f-srv-cert\") pod \"catalog-operator-68c6474976-rwlv8\" (UID: \"7af18cbb-d68f-4d02-9556-62ea57ed250f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702907 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5kvx\" (UniqueName: \"kubernetes.io/projected/c9862b5b-24f3-41bc-aae6-36f911cf57a0-kube-api-access-m5kvx\") pod \"olm-operator-6b444d44fb-4xztb\" (UID: \"c9862b5b-24f3-41bc-aae6-36f911cf57a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702929 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fswxq\" (UniqueName: 
\"kubernetes.io/projected/6defa317-08ba-4208-8537-f7ed45bc26e9-kube-api-access-fswxq\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702962 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0405d04c-84c3-4ea7-8efe-9216684f0f97-certs\") pod \"machine-config-server-nwdsz\" (UID: \"0405d04c-84c3-4ea7-8efe-9216684f0f97\") " pod="openshift-machine-config-operator/machine-config-server-nwdsz" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.702983 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b19dd389-5b85-4864-b4d7-bcf3222b1061-tmpfs\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: \"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703006 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a52c4031-f252-41d9-9904-eb7e9f78d501-serving-cert\") pod \"service-ca-operator-777779d784-tmtmr\" (UID: \"a52c4031-f252-41d9-9904-eb7e9f78d501\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703027 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvghs\" (UniqueName: \"kubernetes.io/projected/0405d04c-84c3-4ea7-8efe-9216684f0f97-kube-api-access-hvghs\") pod \"machine-config-server-nwdsz\" (UID: \"0405d04c-84c3-4ea7-8efe-9216684f0f97\") " pod="openshift-machine-config-operator/machine-config-server-nwdsz" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703070 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-b7cz6\" (UniqueName: \"kubernetes.io/projected/415988cf-7b69-44c8-a978-2ba440a2196b-kube-api-access-b7cz6\") pod \"migrator-59844c95c7-s4nz5\" (UID: \"415988cf-7b69-44c8-a978-2ba440a2196b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703091 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/990e931d-e194-4eb0-91dc-d2d7ea0f8d4e-signing-cabundle\") pod \"service-ca-9c57cc56f-b92mn\" (UID: \"990e931d-e194-4eb0-91dc-d2d7ea0f8d4e\") " pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703120 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-socket-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703139 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b19dd389-5b85-4864-b4d7-bcf3222b1061-webhook-cert\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: \"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703161 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59ae7330-c38e-426b-8781-4115ebed8c71-metrics-tls\") pod \"dns-default-br4rv\" (UID: \"59ae7330-c38e-426b-8781-4115ebed8c71\") " pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703206 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c9862b5b-24f3-41bc-aae6-36f911cf57a0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4xztb\" (UID: \"c9862b5b-24f3-41bc-aae6-36f911cf57a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703238 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b19dd389-5b85-4864-b4d7-bcf3222b1061-apiservice-cert\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: \"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703261 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c9862b5b-24f3-41bc-aae6-36f911cf57a0-srv-cert\") pod \"olm-operator-6b444d44fb-4xztb\" (UID: \"c9862b5b-24f3-41bc-aae6-36f911cf57a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703293 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmlck\" (UniqueName: \"kubernetes.io/projected/841bb384-9633-40c9-883b-109faa7e681d-kube-api-access-rmlck\") pod \"ingress-canary-96wpb\" (UID: \"841bb384-9633-40c9-883b-109faa7e681d\") " pod="openshift-ingress-canary/ingress-canary-96wpb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703315 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/990e931d-e194-4eb0-91dc-d2d7ea0f8d4e-signing-key\") pod \"service-ca-9c57cc56f-b92mn\" (UID: \"990e931d-e194-4eb0-91dc-d2d7ea0f8d4e\") " pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703337 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7af18cbb-d68f-4d02-9556-62ea57ed250f-profile-collector-cert\") pod \"catalog-operator-68c6474976-rwlv8\" (UID: \"7af18cbb-d68f-4d02-9556-62ea57ed250f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703359 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6defa317-08ba-4208-8537-f7ed45bc26e9-ready\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703381 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50268c7a-5457-412a-8233-f8045815e7bf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xfr4\" (UID: \"50268c7a-5457-412a-8233-f8045815e7bf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703400 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-serving-cert\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703422 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50268c7a-5457-412a-8233-f8045815e7bf-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xfr4\" (UID: \"50268c7a-5457-412a-8233-f8045815e7bf\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703444 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nsnr\" (UniqueName: \"kubernetes.io/projected/7af18cbb-d68f-4d02-9556-62ea57ed250f-kube-api-access-6nsnr\") pod \"catalog-operator-68c6474976-rwlv8\" (UID: \"7af18cbb-d68f-4d02-9556-62ea57ed250f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703454 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-csi-data-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703469 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6defa317-08ba-4208-8537-f7ed45bc26e9-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703539 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6defa317-08ba-4208-8537-f7ed45bc26e9-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703559 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e11728f-727f-4ed6-8c97-7d3383fb0db1-proxy-tls\") pod \"machine-config-operator-74547568cd-4wgpt\" 
(UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703584 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e11728f-727f-4ed6-8c97-7d3383fb0db1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4wgpt\" (UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703620 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g6hk\" (UniqueName: \"kubernetes.io/projected/df7a5934-8cbe-48de-badf-a0bf93119820-kube-api-access-8g6hk\") pod \"control-plane-machine-set-operator-78cbb6b69f-skz22\" (UID: \"df7a5934-8cbe-48de-badf-a0bf93119820\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703646 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/841bb384-9633-40c9-883b-109faa7e681d-cert\") pod \"ingress-canary-96wpb\" (UID: \"841bb384-9633-40c9-883b-109faa7e681d\") " pod="openshift-ingress-canary/ingress-canary-96wpb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703667 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cebd2b5f-d730-498e-a779-ad053c71f5ff-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7zf4d\" (UID: \"cebd2b5f-d730-498e-a779-ad053c71f5ff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703694 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7d7qh\" (UniqueName: \"kubernetes.io/projected/695d6bbc-9f78-4920-8186-a77d167378a9-kube-api-access-7d7qh\") pod \"collect-profiles-29402805-z6w5m\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703725 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dncsd\" (UniqueName: \"kubernetes.io/projected/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-kube-api-access-dncsd\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703751 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f8rg\" (UniqueName: \"kubernetes.io/projected/a52c4031-f252-41d9-9904-eb7e9f78d501-kube-api-access-2f8rg\") pod \"service-ca-operator-777779d784-tmtmr\" (UID: \"a52c4031-f252-41d9-9904-eb7e9f78d501\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703794 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/df7a5934-8cbe-48de-badf-a0bf93119820-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-skz22\" (UID: \"df7a5934-8cbe-48de-badf-a0bf93119820\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703820 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k22tk\" (UniqueName: \"kubernetes.io/projected/b1e02d51-3be7-4c15-9e50-f446bca05403-kube-api-access-k22tk\") pod 
\"marketplace-operator-79b997595-jgc22\" (UID: \"b1e02d51-3be7-4c15-9e50-f446bca05403\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703840 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-mountpoint-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.703862 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/695d6bbc-9f78-4920-8186-a77d167378a9-secret-volume\") pod \"collect-profiles-29402805-z6w5m\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.704246 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59ae7330-c38e-426b-8781-4115ebed8c71-config-volume\") pod \"dns-default-br4rv\" (UID: \"59ae7330-c38e-426b-8781-4115ebed8c71\") " pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.704879 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/695d6bbc-9f78-4920-8186-a77d167378a9-config-volume\") pod \"collect-profiles-29402805-z6w5m\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:13 crc kubenswrapper[4651]: E1126 14:51:13.705796 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:14.205756282 +0000 UTC m=+41.631503956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.706415 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7e11728f-727f-4ed6-8c97-7d3383fb0db1-images\") pod \"machine-config-operator-74547568cd-4wgpt\" (UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.707073 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-config\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.707316 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-registration-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.708316 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/b19dd389-5b85-4864-b4d7-bcf3222b1061-tmpfs\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: \"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.709187 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-tls\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.714789 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jgc22\" (UID: \"b1e02d51-3be7-4c15-9e50-f446bca05403\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.714884 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-trusted-ca\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.715417 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6defa317-08ba-4208-8537-f7ed45bc26e9-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.715690 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/990e931d-e194-4eb0-91dc-d2d7ea0f8d4e-signing-cabundle\") pod \"service-ca-9c57cc56f-b92mn\" (UID: \"990e931d-e194-4eb0-91dc-d2d7ea0f8d4e\") " pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.715758 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-socket-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.716759 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-plugins-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.717705 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b19dd389-5b85-4864-b4d7-bcf3222b1061-webhook-cert\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: \"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.719423 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7af18cbb-d68f-4d02-9556-62ea57ed250f-srv-cert\") pod \"catalog-operator-68c6474976-rwlv8\" (UID: \"7af18cbb-d68f-4d02-9556-62ea57ed250f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.721023 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/50268c7a-5457-412a-8233-f8045815e7bf-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xfr4\" (UID: \"50268c7a-5457-412a-8233-f8045815e7bf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.723185 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/695d6bbc-9f78-4920-8186-a77d167378a9-secret-volume\") pod \"collect-profiles-29402805-z6w5m\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.723609 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jgc22\" (UID: \"b1e02d51-3be7-4c15-9e50-f446bca05403\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.728681 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52c4031-f252-41d9-9904-eb7e9f78d501-config\") pod \"service-ca-operator-777779d784-tmtmr\" (UID: \"a52c4031-f252-41d9-9904-eb7e9f78d501\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.730431 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59ae7330-c38e-426b-8781-4115ebed8c71-metrics-tls\") pod \"dns-default-br4rv\" (UID: \"59ae7330-c38e-426b-8781-4115ebed8c71\") " pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.736826 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b19dd389-5b85-4864-b4d7-bcf3222b1061-apiservice-cert\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: \"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.741556 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2d474cd7-8d0f-40f2-b125-94c074eee3c2-mountpoint-dir\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.742069 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-serving-cert\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.742464 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c9862b5b-24f3-41bc-aae6-36f911cf57a0-srv-cert\") pod \"olm-operator-6b444d44fb-4xztb\" (UID: \"c9862b5b-24f3-41bc-aae6-36f911cf57a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.744180 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6defa317-08ba-4208-8537-f7ed45bc26e9-ready\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.770974 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/0405d04c-84c3-4ea7-8efe-9216684f0f97-certs\") pod \"machine-config-server-nwdsz\" (UID: \"0405d04c-84c3-4ea7-8efe-9216684f0f97\") " pod="openshift-machine-config-operator/machine-config-server-nwdsz" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.774202 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7af18cbb-d68f-4d02-9556-62ea57ed250f-profile-collector-cert\") pod \"catalog-operator-68c6474976-rwlv8\" (UID: \"7af18cbb-d68f-4d02-9556-62ea57ed250f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.775144 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcww8\" (UniqueName: \"kubernetes.io/projected/2d474cd7-8d0f-40f2-b125-94c074eee3c2-kube-api-access-zcww8\") pod \"csi-hostpathplugin-2jn8n\" (UID: \"2d474cd7-8d0f-40f2-b125-94c074eee3c2\") " pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.777862 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/df7a5934-8cbe-48de-badf-a0bf93119820-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-skz22\" (UID: \"df7a5934-8cbe-48de-badf-a0bf93119820\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.778339 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50268c7a-5457-412a-8233-f8045815e7bf-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xfr4\" (UID: \"50268c7a-5457-412a-8233-f8045815e7bf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" Nov 26 14:51:13 crc 
kubenswrapper[4651]: I1126 14:51:13.779154 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cebd2b5f-d730-498e-a779-ad053c71f5ff-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7zf4d\" (UID: \"cebd2b5f-d730-498e-a779-ad053c71f5ff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.788611 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c9862b5b-24f3-41bc-aae6-36f911cf57a0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4xztb\" (UID: \"c9862b5b-24f3-41bc-aae6-36f911cf57a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.800762 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a52c4031-f252-41d9-9904-eb7e9f78d501-serving-cert\") pod \"service-ca-operator-777779d784-tmtmr\" (UID: \"a52c4031-f252-41d9-9904-eb7e9f78d501\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.801662 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7e11728f-727f-4ed6-8c97-7d3383fb0db1-proxy-tls\") pod \"machine-config-operator-74547568cd-4wgpt\" (UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.814576 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:13 crc kubenswrapper[4651]: E1126 14:51:13.815528 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:14.315391975 +0000 UTC m=+41.741139579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.825438 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7e11728f-727f-4ed6-8c97-7d3383fb0db1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4wgpt\" (UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.857447 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.877622 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/990e931d-e194-4eb0-91dc-d2d7ea0f8d4e-signing-key\") pod \"service-ca-9c57cc56f-b92mn\" (UID: \"990e931d-e194-4eb0-91dc-d2d7ea0f8d4e\") " pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.877700 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvghs\" (UniqueName: \"kubernetes.io/projected/0405d04c-84c3-4ea7-8efe-9216684f0f97-kube-api-access-hvghs\") pod \"machine-config-server-nwdsz\" (UID: \"0405d04c-84c3-4ea7-8efe-9216684f0f97\") " pod="openshift-machine-config-operator/machine-config-server-nwdsz" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.879911 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5kvx\" (UniqueName: \"kubernetes.io/projected/c9862b5b-24f3-41bc-aae6-36f911cf57a0-kube-api-access-m5kvx\") pod \"olm-operator-6b444d44fb-4xztb\" (UID: \"c9862b5b-24f3-41bc-aae6-36f911cf57a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.881763 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp"] Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.886472 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/841bb384-9633-40c9-883b-109faa7e681d-cert\") pod \"ingress-canary-96wpb\" (UID: \"841bb384-9633-40c9-883b-109faa7e681d\") " pod="openshift-ingress-canary/ingress-canary-96wpb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.889887 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7cz6\" (UniqueName: \"kubernetes.io/projected/415988cf-7b69-44c8-a978-2ba440a2196b-kube-api-access-b7cz6\") pod \"migrator-59844c95c7-s4nz5\" (UID: \"415988cf-7b69-44c8-a978-2ba440a2196b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.895939 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgwtz\" (UniqueName: \"kubernetes.io/projected/cebd2b5f-d730-498e-a779-ad053c71f5ff-kube-api-access-zgwtz\") pod \"package-server-manager-789f6589d5-7zf4d\" (UID: \"cebd2b5f-d730-498e-a779-ad053c71f5ff\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.896833 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.897598 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c6kjm"] Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.905709 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0405d04c-84c3-4ea7-8efe-9216684f0f97-node-bootstrap-token\") pod \"machine-config-server-nwdsz\" (UID: \"0405d04c-84c3-4ea7-8efe-9216684f0f97\") " pod="openshift-machine-config-operator/machine-config-server-nwdsz" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.910878 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l597\" (UniqueName: \"kubernetes.io/projected/990e931d-e194-4eb0-91dc-d2d7ea0f8d4e-kube-api-access-6l597\") pod \"service-ca-9c57cc56f-b92mn\" (UID: \"990e931d-e194-4eb0-91dc-d2d7ea0f8d4e\") " pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" 
Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.911091 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8dk8\" (UniqueName: \"kubernetes.io/projected/b19dd389-5b85-4864-b4d7-bcf3222b1061-kube-api-access-m8dk8\") pod \"packageserver-d55dfcdfc-d2wqw\" (UID: \"b19dd389-5b85-4864-b4d7-bcf3222b1061\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: W1126 14:51:13.915713 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6c80d47_92ec_4861_8936_289e6525a876.slice/crio-8904ca6830ca1aed4be26e674ee03a11c65924fe061731bf38dc693a64c3fd4a WatchSource:0}: Error finding container 8904ca6830ca1aed4be26e674ee03a11c65924fe061731bf38dc693a64c3fd4a: Status 404 returned error can't find the container with id 8904ca6830ca1aed4be26e674ee03a11c65924fe061731bf38dc693a64c3fd4a Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.928189 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.928820 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.931727 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fswxq\" (UniqueName: \"kubernetes.io/projected/6defa317-08ba-4208-8537-f7ed45bc26e9-kube-api-access-fswxq\") pod \"cni-sysctl-allowlist-ds-jzbt6\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.931947 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.933792 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skjbs\" (UniqueName: \"kubernetes.io/projected/59ae7330-c38e-426b-8781-4115ebed8c71-kube-api-access-skjbs\") pod \"dns-default-br4rv\" (UID: \"59ae7330-c38e-426b-8781-4115ebed8c71\") " pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:13 crc kubenswrapper[4651]: E1126 14:51:13.934899 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:14.434878344 +0000 UTC m=+41.860625948 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.941371 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsgt6\" (UniqueName: \"kubernetes.io/projected/7e11728f-727f-4ed6-8c97-7d3383fb0db1-kube-api-access-qsgt6\") pod \"machine-config-operator-74547568cd-4wgpt\" (UID: \"7e11728f-727f-4ed6-8c97-7d3383fb0db1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.962152 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmlck\" (UniqueName: \"kubernetes.io/projected/841bb384-9633-40c9-883b-109faa7e681d-kube-api-access-rmlck\") pod \"ingress-canary-96wpb\" (UID: \"841bb384-9633-40c9-883b-109faa7e681d\") " pod="openshift-ingress-canary/ingress-canary-96wpb" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.967637 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dncsd\" (UniqueName: \"kubernetes.io/projected/af7dcbad-1236-40d4-9c4b-0fa57fb76df3-kube-api-access-dncsd\") pod \"console-operator-58897d9998-c6sbq\" (UID: \"af7dcbad-1236-40d4-9c4b-0fa57fb76df3\") " pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.972731 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.987226 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f8rg\" (UniqueName: \"kubernetes.io/projected/a52c4031-f252-41d9-9904-eb7e9f78d501-kube-api-access-2f8rg\") pod \"service-ca-operator-777779d784-tmtmr\" (UID: \"a52c4031-f252-41d9-9904-eb7e9f78d501\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" Nov 26 14:51:13 crc kubenswrapper[4651]: I1126 14:51:13.996705 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.018678 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-96wpb" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.019552 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.030201 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.032516 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:14 crc kubenswrapper[4651]: E1126 14:51:14.032916 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:14.532900795 +0000 UTC m=+41.958648399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.033857 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k22tk\" (UniqueName: \"kubernetes.io/projected/b1e02d51-3be7-4c15-9e50-f446bca05403-kube-api-access-k22tk\") pod \"marketplace-operator-79b997595-jgc22\" (UID: \"b1e02d51-3be7-4c15-9e50-f446bca05403\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.038064 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nwdsz" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.039524 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d7qh\" (UniqueName: \"kubernetes.io/projected/695d6bbc-9f78-4920-8186-a77d167378a9-kube-api-access-7d7qh\") pod \"collect-profiles-29402805-z6w5m\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.067678 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50268c7a-5457-412a-8233-f8045815e7bf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xfr4\" (UID: \"50268c7a-5457-412a-8233-f8045815e7bf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.085179 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l"] Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.105599 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nsnr\" (UniqueName: \"kubernetes.io/projected/7af18cbb-d68f-4d02-9556-62ea57ed250f-kube-api-access-6nsnr\") pod \"catalog-operator-68c6474976-rwlv8\" (UID: \"7af18cbb-d68f-4d02-9556-62ea57ed250f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.107421 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g6hk\" (UniqueName: \"kubernetes.io/projected/df7a5934-8cbe-48de-badf-a0bf93119820-kube-api-access-8g6hk\") pod \"control-plane-machine-set-operator-78cbb6b69f-skz22\" (UID: \"df7a5934-8cbe-48de-badf-a0bf93119820\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.133388 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:14 crc kubenswrapper[4651]: E1126 14:51:14.133676 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:14.633660266 +0000 UTC m=+42.059407870 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.137355 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pp9mp"] Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.137956 4651 generic.go:334] "Generic (PLEG): container finished" podID="e7c2a5af-4204-4822-bec4-8589813d80df" containerID="be71765b0e09b6f3194901831bf60cf43ad6942703773588263280a070b27fb9" exitCode=0 Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.138012 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" 
event={"ID":"e7c2a5af-4204-4822-bec4-8589813d80df","Type":"ContainerDied","Data":"be71765b0e09b6f3194901831bf60cf43ad6942703773588263280a070b27fb9"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.140342 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4"] Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.153260 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" event={"ID":"d6c80d47-92ec-4861-8936-289e6525a876","Type":"ContainerStarted","Data":"8904ca6830ca1aed4be26e674ee03a11c65924fe061731bf38dc693a64c3fd4a"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.160635 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.167487 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.177340 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vw9bz" event={"ID":"7856b53a-287e-4c39-9f3f-0f384ecc84fe","Type":"ContainerStarted","Data":"8f9ae4efa9251d95eac4ffbe2b3cfbfe4d79b3506b2200f3e3bf95ec1a07b698"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.177585 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.179765 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.193912 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.194277 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" event={"ID":"bb6d2ae9-7867-4995-97b5-33740c0de594","Type":"ContainerStarted","Data":"64b50b46505396ec73962730b37c6ea6a6159332f251ba0f787c942f4af9b0d7"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.198427 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm" event={"ID":"d53b7ba5-49be-4aa3-87d6-89c74221cfda","Type":"ContainerStarted","Data":"9aa64ca9abc30ba7d6f4de664ddf28ee76e11509680d632549c75906a60eb01d"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.204403 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6" event={"ID":"56fe610b-235d-4252-9199-24c83fb3f457","Type":"ContainerStarted","Data":"f9e68500325f44fb1cd34153e65e56d48bdd08584077f507c4260d8a067ba645"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.206400 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.206942 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q4qzb" event={"ID":"74cd140b-bb74-4152-bb6f-0a42f92c864e","Type":"ContainerStarted","Data":"0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.207105 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q4qzb" event={"ID":"74cd140b-bb74-4152-bb6f-0a42f92c864e","Type":"ContainerStarted","Data":"b16eec6816a403c6deb6edf40af131b29cdec029d85256ce0b42d77bb49e5867"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.209875 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw" event={"ID":"fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76","Type":"ContainerStarted","Data":"23633cb23c053ee4123b5056ae807dccbb9f6100d8806b4cd88b07a56db4ce3e"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.214107 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" event={"ID":"7efeda68-504a-457c-8576-15a4eb8ffc86","Type":"ContainerStarted","Data":"bf94e34cd1593f8e96a9d4e921f866e3b4ac608d9bd3125ec1be1995e45a0885"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.219533 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9" event={"ID":"4b814105-58ac-41b6-8b52-efa5de815233","Type":"ContainerStarted","Data":"26db83535a5489e1b6a9a21bf3066ad9291bcc13da5488b712702b81ec85ecef"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.219582 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9" 
event={"ID":"4b814105-58ac-41b6-8b52-efa5de815233","Type":"ContainerStarted","Data":"06ab480e824cb78ae9a38ebed73871a23fae40ecd844bcaa3b02705d20ae60c1"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.222771 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" event={"ID":"39aef3e6-7314-4d82-8e9c-a83d505e022e","Type":"ContainerStarted","Data":"01a6be6a7355b85ec623386b05234d642617aff841a83c453befb62aac29e12f"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.223436 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.225531 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" event={"ID":"1b1058c7-8ca9-41f7-b961-0b48e973c6c6","Type":"ContainerStarted","Data":"3e36e53bfe1f4916cd7c6b423c0d27dd0a4c9190a2a6150a9866ab43964b638d"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.235214 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.235230 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:14 crc kubenswrapper[4651]: E1126 14:51:14.235604 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:14.735590168 +0000 UTC m=+42.161337772 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.241495 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mgxls" event={"ID":"e69d02a9-477f-4281-bb15-469b21b21f7a","Type":"ContainerStarted","Data":"c288e5c5b7a7527c8ffb4b8d31c4d8311e5c0253b3ba65ca227c2fd8bc8e5891"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.246704 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.252130 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" event={"ID":"916a34e5-fa74-4e59-9deb-18a4067f007b","Type":"ContainerStarted","Data":"f572855ef1a897e7dfef9eb7a6a60dfbcb16ce607f7a08e0473e96ce186e3ac1"} Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.253896 4651 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9zwm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.253924 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v9zwm" podUID="9010f7b8-93e2-47e6-ab50-16ca7a9b337d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.256462 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.335718 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:14 crc kubenswrapper[4651]: E1126 14:51:14.336698 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:14.836684408 +0000 UTC m=+42.262432012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.346598 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq"] Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.437060 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:14 crc 
kubenswrapper[4651]: E1126 14:51:14.438636 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:14.938621541 +0000 UTC m=+42.364369145 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.485835 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2gwmf" podStartSLOduration=16.485814409 podStartE2EDuration="16.485814409s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:14.457890463 +0000 UTC m=+41.883638097" watchObservedRunningTime="2025-11-26 14:51:14.485814409 +0000 UTC m=+41.911562013" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.516622 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d"] Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.520949 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-6jcgh" podStartSLOduration=16.520927413 podStartE2EDuration="16.520927413s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:14.490253794 +0000 UTC m=+41.916001398" watchObservedRunningTime="2025-11-26 14:51:14.520927413 +0000 UTC m=+41.946675017" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.531498 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.538688 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:14 crc kubenswrapper[4651]: E1126 14:51:14.540384 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:15.040360048 +0000 UTC m=+42.466107652 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.540984 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:14 crc kubenswrapper[4651]: E1126 14:51:14.541384 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:15.041372854 +0000 UTC m=+42.467120458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.605489 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" podStartSLOduration=16.605463922 podStartE2EDuration="16.605463922s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:14.600223866 +0000 UTC m=+42.025971490" watchObservedRunningTime="2025-11-26 14:51:14.605463922 +0000 UTC m=+42.031211546" Nov 26 14:51:14 crc kubenswrapper[4651]: W1126 14:51:14.613814 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f4b33d_62ab_442b_a11a_2c62f88c3b80.slice/crio-73c340734918ec5a9f5c947e4faef444887480069584ab53766830e3f4176830 WatchSource:0}: Error finding container 73c340734918ec5a9f5c947e4faef444887480069584ab53766830e3f4176830: Status 404 returned error can't find the container with id 73c340734918ec5a9f5c947e4faef444887480069584ab53766830e3f4176830 Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.653462 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 
14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.653706 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs\") pod \"network-metrics-daemon-79fzh\" (UID: \"46f059e4-ddf4-4e21-b528-0cc9cec8afa1\") " pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:51:14 crc kubenswrapper[4651]: E1126 14:51:14.654445 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:15.154415676 +0000 UTC m=+42.580163330 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.672931 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46f059e4-ddf4-4e21-b528-0cc9cec8afa1-metrics-certs\") pod \"network-metrics-daemon-79fzh\" (UID: \"46f059e4-ddf4-4e21-b528-0cc9cec8afa1\") " pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.744283 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-br4rv"] Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.752092 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-79fzh" Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.760678 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:14 crc kubenswrapper[4651]: E1126 14:51:14.760992 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:15.260977709 +0000 UTC m=+42.686725313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.833944 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-96wpb"] Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.866930 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:14 crc kubenswrapper[4651]: E1126 14:51:14.867635 4651 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:15.367598633 +0000 UTC m=+42.793346247 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.868375 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:14 crc kubenswrapper[4651]: E1126 14:51:14.868926 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:15.368910187 +0000 UTC m=+42.794657791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:14 crc kubenswrapper[4651]: I1126 14:51:14.969936 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:14 crc kubenswrapper[4651]: E1126 14:51:14.970328 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:15.470306895 +0000 UTC m=+42.896054499 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.072080 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:15 crc kubenswrapper[4651]: E1126 14:51:15.072765 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:15.572748721 +0000 UTC m=+42.998496325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.176111 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:15 crc kubenswrapper[4651]: E1126 14:51:15.176378 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:15.676362707 +0000 UTC m=+43.102110311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.178488 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw"] Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.231388 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb"] Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.259517 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-b92mn"] Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.278934 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:15 crc kubenswrapper[4651]: E1126 14:51:15.279960 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:15.779944362 +0000 UTC m=+43.205691966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.360155 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2jn8n"] Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.373166 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sxkh9" podStartSLOduration=17.373150497 podStartE2EDuration="17.373150497s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:15.324421549 +0000 UTC m=+42.750169153" watchObservedRunningTime="2025-11-26 14:51:15.373150497 +0000 UTC m=+42.798898101" Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.390983 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:15 crc kubenswrapper[4651]: E1126 14:51:15.391468 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:15.891448803 +0000 UTC m=+43.317196407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.492718 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:15 crc kubenswrapper[4651]: E1126 14:51:15.493287 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:15.993270512 +0000 UTC m=+43.419018126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.507422 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5"] Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.507471 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt"] Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.514232 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" event={"ID":"5c36becc-6886-4b68-8a62-9a857bd09359","Type":"ContainerStarted","Data":"f3c696a5ecb38a15a355d3beb8342c5294a3ac1c10e5df6d44c0ffc5a6c9aba6"} Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.526325 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" podStartSLOduration=16.526305692 podStartE2EDuration="16.526305692s" podCreationTimestamp="2025-11-26 14:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:15.524184827 +0000 UTC m=+42.949932441" watchObservedRunningTime="2025-11-26 14:51:15.526305692 +0000 UTC m=+42.952053296" Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.552856 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22"] Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.596289 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:15 crc kubenswrapper[4651]: E1126 14:51:15.596512 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:16.096494368 +0000 UTC m=+43.522241972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.617705 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-q4qzb" podStartSLOduration=17.61768541 podStartE2EDuration="17.61768541s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:15.573950302 +0000 UTC m=+42.999697926" watchObservedRunningTime="2025-11-26 14:51:15.61768541 +0000 UTC m=+43.043433014" Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.629705 4651 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-dns/dns-default-br4rv" event={"ID":"59ae7330-c38e-426b-8781-4115ebed8c71","Type":"ContainerStarted","Data":"5bc558234158eefb0ea57913238a9348f276939cbec03b313ccc482c525e7c6e"} Nov 26 14:51:15 crc kubenswrapper[4651]: W1126 14:51:15.643972 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d474cd7_8d0f_40f2_b125_94c074eee3c2.slice/crio-2a398315bd45ee62cb36ca427a26e3083c2a37cb80a1a1213ae6321ad3f8b688 WatchSource:0}: Error finding container 2a398315bd45ee62cb36ca427a26e3083c2a37cb80a1a1213ae6321ad3f8b688: Status 404 returned error can't find the container with id 2a398315bd45ee62cb36ca427a26e3083c2a37cb80a1a1213ae6321ad3f8b688 Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.700344 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:15 crc kubenswrapper[4651]: E1126 14:51:15.700745 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:16.20073052 +0000 UTC m=+43.626478124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.804296 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" event={"ID":"916a34e5-fa74-4e59-9deb-18a4067f007b","Type":"ContainerStarted","Data":"66683b0bb1d1c80326bec688842c05f41bf6b0b90809b93a8a36e3fe4b058e2d"} Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.804575 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.804885 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 14:51:15 crc kubenswrapper[4651]: E1126 14:51:15.804950 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:16.304933221 +0000 UTC m=+43.730680825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.821202 4651 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fzwc6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.821262 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" podUID="916a34e5-fa74-4e59-9deb-18a4067f007b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.827878 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-v9zwm" podStartSLOduration=17.826804281 podStartE2EDuration="17.826804281s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:15.822252763 +0000 UTC m=+43.248000377" watchObservedRunningTime="2025-11-26 14:51:15.826804281 +0000 UTC m=+43.252551885" Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.850852 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt"] Nov 26 14:51:15 
crc kubenswrapper[4651]: I1126 14:51:15.895265 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6" event={"ID":"56fe610b-235d-4252-9199-24c83fb3f457","Type":"ContainerStarted","Data":"279914acd041676f7b8439ca988f0f113b0de4a7346f1a1c13c9361a1f148322"} Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.906207 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:15 crc kubenswrapper[4651]: E1126 14:51:15.906759 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:16.406735101 +0000 UTC m=+43.832482705 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:15 crc kubenswrapper[4651]: W1126 14:51:15.931761 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf7a5934_8cbe_48de_badf_a0bf93119820.slice/crio-199ba85e5a6894f59fbe499e46db929e4c97da114c4896bc7acf75d5c7eba0f1 WatchSource:0}: Error finding container 199ba85e5a6894f59fbe499e46db929e4c97da114c4896bc7acf75d5c7eba0f1: Status 404 returned error can't find the container with id 199ba85e5a6894f59fbe499e46db929e4c97da114c4896bc7acf75d5c7eba0f1 Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.951804 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jgc22"] Nov 26 14:51:15 crc kubenswrapper[4651]: I1126 14:51:15.986855 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m"] Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.017223 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.017504 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:16.517476462 +0000 UTC m=+43.943224116 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.017564 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.019025 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" podStartSLOduration=18.019000012 podStartE2EDuration="18.019000012s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:16.01664449 +0000 UTC m=+43.442392114" watchObservedRunningTime="2025-11-26 14:51:16.019000012 +0000 UTC m=+43.444747616" Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.019188 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:16.519172396 +0000 UTC m=+43.944920000 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.025164 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vw9bz" event={"ID":"7856b53a-287e-4c39-9f3f-0f384ecc84fe","Type":"ContainerStarted","Data":"3c1d66c87238ba05a8f14b8e39f3f25d769063d5723d9cc94d50318b8898e622"} Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.027730 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4"] Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.046613 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l" event={"ID":"8ba8185d-5551-4919-88c7-8d98a1a955b6","Type":"ContainerStarted","Data":"450710c0b0c905286d540e63a0a022002db9b101ce416085b17ae1b42617d015"} Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.067426 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" event={"ID":"9c5376be-3ddd-4168-aed7-8ea2bc1fc97e","Type":"ContainerStarted","Data":"dacfc549bf6aa7c85a688f6b3fd4967ebe6a5345de555ad767efa0920d1ec9c2"} Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.119951 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.120450 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:16.620434261 +0000 UTC m=+44.046181865 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.124364 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" event={"ID":"cebd2b5f-d730-498e-a779-ad053c71f5ff","Type":"ContainerStarted","Data":"722a3096d98c6bd8fdc0d627cdaaba90af74c086d332554f8b2a0fe3fb43e60b"} Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.139199 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" event={"ID":"1b1058c7-8ca9-41f7-b961-0b48e973c6c6","Type":"ContainerStarted","Data":"c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2"} Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.141046 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:16 crc 
kubenswrapper[4651]: I1126 14:51:16.141697 4651 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-9h5h8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.141726 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" podUID="1b1058c7-8ca9-41f7-b961-0b48e973c6c6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.160744 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-c6sbq"] Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.175291 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8"] Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.206898 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" event={"ID":"97f4b33d-62ab-442b-a11a-2c62f88c3b80","Type":"ContainerStarted","Data":"73c340734918ec5a9f5c947e4faef444887480069584ab53766830e3f4176830"} Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.236903 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr"] Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.241361 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: 
\"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.241918 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:16.741903651 +0000 UTC m=+44.167651255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.255253 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" event={"ID":"bb6d2ae9-7867-4995-97b5-33740c0de594","Type":"ContainerStarted","Data":"3ea86c5c9f8b3caaaf9ec1d3632830a1b4c2b54c8e7744b1cecf9e074b9e904a"} Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.264704 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-96wpb" event={"ID":"841bb384-9633-40c9-883b-109faa7e681d","Type":"ContainerStarted","Data":"01fe21c57faaf65e197a2c413b7fb186817dac337a654c2bc1de61c62bbf14e2"} Nov 26 14:51:16 crc kubenswrapper[4651]: W1126 14:51:16.293163 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda52c4031_f252_41d9_9904_eb7e9f78d501.slice/crio-c1656c3d19046a76bb64fc32ca86652ef89fc2fa5ca49862559b007458d9d6d3 WatchSource:0}: Error finding container 
c1656c3d19046a76bb64fc32ca86652ef89fc2fa5ca49862559b007458d9d6d3: Status 404 returned error can't find the container with id c1656c3d19046a76bb64fc32ca86652ef89fc2fa5ca49862559b007458d9d6d3 Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.294878 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" podStartSLOduration=18.294863029 podStartE2EDuration="18.294863029s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:16.287601591 +0000 UTC m=+43.713349195" watchObservedRunningTime="2025-11-26 14:51:16.294863029 +0000 UTC m=+43.720610643" Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.325676 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xc7c6" podStartSLOduration=18.3256625 podStartE2EDuration="18.3256625s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:16.324257744 +0000 UTC m=+43.750005358" watchObservedRunningTime="2025-11-26 14:51:16.3256625 +0000 UTC m=+43.751410104" Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.325733 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.332884 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:16 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:16 crc 
kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:16 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.332933 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.343612 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.348244 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:16.848225977 +0000 UTC m=+44.273973581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.383654 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" event={"ID":"6defa317-08ba-4208-8537-f7ed45bc26e9","Type":"ContainerStarted","Data":"19c8310978a5bd447e255691ef673ad4dc83ddc56f3862f3294589b44fe39c09"} Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.398652 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kps8x" Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.409737 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-vw9bz" podStartSLOduration=18.409712507 podStartE2EDuration="18.409712507s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:16.398228309 +0000 UTC m=+43.823975933" watchObservedRunningTime="2025-11-26 14:51:16.409712507 +0000 UTC m=+43.835460111" Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.446538 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.446887 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:16.946875185 +0000 UTC m=+44.372622789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.555613 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.558359 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.058332704 +0000 UTC m=+44.484080358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.657235 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.657580 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.157566907 +0000 UTC m=+44.583314511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.729184 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-79fzh"] Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.764432 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.764707 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.264693034 +0000 UTC m=+44.690440638 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.866059 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.866439 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.366423691 +0000 UTC m=+44.792171295 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.977588 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.977724 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.477705706 +0000 UTC m=+44.903453310 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:16 crc kubenswrapper[4651]: I1126 14:51:16.977988 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:16 crc kubenswrapper[4651]: E1126 14:51:16.978398 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.478388204 +0000 UTC m=+44.904135808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.078995 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.079114 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.579098774 +0000 UTC m=+45.004846378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.079344 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.079618 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.579608477 +0000 UTC m=+45.005356081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.181446 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.181738 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.681717145 +0000 UTC m=+45.107464749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.181902 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.182291 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.682277069 +0000 UTC m=+45.108024683 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.284390 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.284797 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.784778587 +0000 UTC m=+45.210526191 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.333232 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:17 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:17 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:17 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.333286 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.389099 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.389498 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:17.889483011 +0000 UTC m=+45.315230615 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.419242 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" event={"ID":"6defa317-08ba-4208-8537-f7ed45bc26e9","Type":"ContainerStarted","Data":"dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.419281 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.433162 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" event={"ID":"990e931d-e194-4eb0-91dc-d2d7ea0f8d4e","Type":"ContainerStarted","Data":"a1ad6662ae6f07a27ff7da1897d1a14ee26b5545dc746ec9b9605edaf6b7a14e"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.440787 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" event={"ID":"7efeda68-504a-457c-8576-15a4eb8ffc86","Type":"ContainerStarted","Data":"318e26f4e55cbd66425899284e2127c30a17d01b41037790adb2283e794cdf3b"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.458513 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l" 
event={"ID":"8ba8185d-5551-4919-88c7-8d98a1a955b6","Type":"ContainerStarted","Data":"27995aacfaae9df314d79c35c0eed3be00248ea73389adf7ef20d06878a842a0"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.464859 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" podStartSLOduration=7.464836731 podStartE2EDuration="7.464836731s" podCreationTimestamp="2025-11-26 14:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:17.447514471 +0000 UTC m=+44.873262085" watchObservedRunningTime="2025-11-26 14:51:17.464836731 +0000 UTC m=+44.890584335" Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.486312 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-79fzh" event={"ID":"46f059e4-ddf4-4e21-b528-0cc9cec8afa1","Type":"ContainerStarted","Data":"c13fa3453aeed34c8d6aa18ac11020c79f6210d5d2eafe20ab49364789bcedbf"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.488978 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pm87n" podStartSLOduration=19.488957618 podStartE2EDuration="19.488957618s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:17.4886402 +0000 UTC m=+44.914387814" watchObservedRunningTime="2025-11-26 14:51:17.488957618 +0000 UTC m=+44.914705222" Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.489754 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" event={"ID":"2d474cd7-8d0f-40f2-b125-94c074eee3c2","Type":"ContainerStarted","Data":"2a398315bd45ee62cb36ca427a26e3083c2a37cb80a1a1213ae6321ad3f8b688"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 
14:51:17.490029 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.490270 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:17.990239772 +0000 UTC m=+45.415987396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.490494 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.493590 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:17.993573419 +0000 UTC m=+45.419321013 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.516363 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" event={"ID":"bb6d2ae9-7867-4995-97b5-33740c0de594","Type":"ContainerStarted","Data":"60942efc203064b97c3e2a0e8e82b3b019524b4e2a94313f02d95178b212a6c3"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.523754 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" event={"ID":"cebd2b5f-d730-498e-a779-ad053c71f5ff","Type":"ContainerStarted","Data":"8ac9f4a8c8db6c323fceb3cee7bfdddbca5a572659083f08149c4b1eb4f22565"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.534421 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-c6sbq" event={"ID":"af7dcbad-1236-40d4-9c4b-0fa57fb76df3","Type":"ContainerStarted","Data":"19b23fc3b17e116fcad2ee0a6e515ad329e9ceea3d8f0540958952442b182313"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.544371 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-n264l" podStartSLOduration=19.54434177 podStartE2EDuration="19.54434177s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:17.537586484 +0000 UTC m=+44.963334088" watchObservedRunningTime="2025-11-26 14:51:17.54434177 +0000 UTC m=+44.970089384" Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.549875 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw" event={"ID":"fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76","Type":"ContainerStarted","Data":"c49e9381030a8919a0377bf9efcb0f900587c920bc533fad80dde9150102626f"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.549922 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw" event={"ID":"fc4b7e2f-02e2-4fe2-bec0-7f74a0dfda76","Type":"ContainerStarted","Data":"11b47ce87e132b3c30b588c850260067e38c554bbbf2fb2c11096c4ce0c73973"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.585702 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rln7f" podStartSLOduration=19.585686006 podStartE2EDuration="19.585686006s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:17.584804553 +0000 UTC m=+45.010552177" watchObservedRunningTime="2025-11-26 14:51:17.585686006 +0000 UTC m=+45.011433610" Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.591678 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.593087 4651 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:18.093065128 +0000 UTC m=+45.518812732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.594634 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" event={"ID":"1bb12b15-c889-47bf-9e26-25196edb90e0","Type":"ContainerStarted","Data":"d8b447b32f174ee6aa6f3602c66ca956399df5d89d40809808e0408fdd7a3f3f"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.615219 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" event={"ID":"695d6bbc-9f78-4920-8186-a77d167378a9","Type":"ContainerStarted","Data":"e6cf5568af6d9b462adf45850066a5d0872043c5cf3f1c7b59ec19c333de0d57"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.618544 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" event={"ID":"e7c2a5af-4204-4822-bec4-8589813d80df","Type":"ContainerStarted","Data":"8856551c91ddb5862d4fc4bac4987d44aa279e7b13214858dd415f2a6c56da36"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.635077 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" event={"ID":"a52c4031-f252-41d9-9904-eb7e9f78d501","Type":"ContainerStarted","Data":"c1656c3d19046a76bb64fc32ca86652ef89fc2fa5ca49862559b007458d9d6d3"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.637427 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhmkw" podStartSLOduration=19.637406211 podStartE2EDuration="19.637406211s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:17.635544163 +0000 UTC m=+45.061291787" watchObservedRunningTime="2025-11-26 14:51:17.637406211 +0000 UTC m=+45.063153815" Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.652151 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" event={"ID":"9c5376be-3ddd-4168-aed7-8ea2bc1fc97e","Type":"ContainerStarted","Data":"01ae2f50932a09946a23bb139d81baa453a7b431ee2994610602906d2b6a5e28"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.678199 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22" event={"ID":"df7a5934-8cbe-48de-badf-a0bf93119820","Type":"ContainerStarted","Data":"b1429c72868efafe39ddd92c13574b206bffa32fd7b342aa49524030714d9ed4"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.678257 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22" event={"ID":"df7a5934-8cbe-48de-badf-a0bf93119820","Type":"ContainerStarted","Data":"199ba85e5a6894f59fbe499e46db929e4c97da114c4896bc7acf75d5c7eba0f1"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.693604 4651 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.693983 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:18.193967653 +0000 UTC m=+45.619715257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.722663 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" event={"ID":"50268c7a-5457-412a-8233-f8045815e7bf","Type":"ContainerStarted","Data":"2d6b7987e19819cfaabb419e3e9d86970ea869a96835075a895f1365b016ef30"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.743289 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" podStartSLOduration=18.743272346 podStartE2EDuration="18.743272346s" podCreationTimestamp="2025-11-26 14:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:17.693335296 +0000 UTC 
m=+45.119082910" watchObservedRunningTime="2025-11-26 14:51:17.743272346 +0000 UTC m=+45.169019950" Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.794474 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.796841 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:18.296822519 +0000 UTC m=+45.722570123 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.847174 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nwdsz" event={"ID":"0405d04c-84c3-4ea7-8efe-9216684f0f97","Type":"ContainerStarted","Data":"d6077ce6ae86549d1eb5b3fd01587a3f8226ba7db601e7add79dfb8e9bbb252b"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.847232 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nwdsz" 
event={"ID":"0405d04c-84c3-4ea7-8efe-9216684f0f97","Type":"ContainerStarted","Data":"a3861cb6f3f2c169ab5b275acc0b83d02d0b8aef17d355bd1d23a9a918148049"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.872516 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" event={"ID":"97f4b33d-62ab-442b-a11a-2c62f88c3b80","Type":"ContainerStarted","Data":"5f6b712d4bc15d8216dec316bdcd2b393ed84a6b609ff8c80aca38616a772aed"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.904729 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:17 crc kubenswrapper[4651]: E1126 14:51:17.905058 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:18.405025445 +0000 UTC m=+45.830773049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.913242 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" event={"ID":"d6c80d47-92ec-4861-8936-289e6525a876","Type":"ContainerStarted","Data":"523d214db4775d936dbcd099349c9473f939e939aee47adb9c1ef1fb0d010448"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.957098 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-skz22" podStartSLOduration=19.957074779 podStartE2EDuration="19.957074779s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:17.745492114 +0000 UTC m=+45.171239718" watchObservedRunningTime="2025-11-26 14:51:17.957074779 +0000 UTC m=+45.382822393" Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.957291 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5" event={"ID":"415988cf-7b69-44c8-a978-2ba440a2196b","Type":"ContainerStarted","Data":"2fa0649c6eb08741168cc77be7c54b2a5d0de74e201be6777357725db229ede5"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.958228 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5" 
event={"ID":"415988cf-7b69-44c8-a978-2ba440a2196b","Type":"ContainerStarted","Data":"2250ffe6c5f7c91e89d56a912c67f4b18223f5be2e1f442e3fc304f3a08275a3"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.961278 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" event={"ID":"7af18cbb-d68f-4d02-9556-62ea57ed250f","Type":"ContainerStarted","Data":"f79e6e0b0b182d488379d1fc4d6625bc1ac122e00c8b98db84b9ad68b5a4369a"} Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.962275 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.963239 4651 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-rwlv8 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.963276 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" podUID="7af18cbb-d68f-4d02-9556-62ea57ed250f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Nov 26 14:51:17 crc kubenswrapper[4651]: I1126 14:51:17.978187 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9" event={"ID":"4b814105-58ac-41b6-8b52-efa5de815233","Type":"ContainerStarted","Data":"d31c742ba06f3e3508975721c305daa18b5b5029b2b8526150271f018cb66764"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.005587 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:18 crc kubenswrapper[4651]: E1126 14:51:18.006789 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:18.506774042 +0000 UTC m=+45.932521646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.010799 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" event={"ID":"b19dd389-5b85-4864-b4d7-bcf3222b1061","Type":"ContainerStarted","Data":"51e03aa1eba75e111f0cf64568c112c7036a8089c5a7e0aea3bc1ebb9b9eec3b"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.010846 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" event={"ID":"b19dd389-5b85-4864-b4d7-bcf3222b1061","Type":"ContainerStarted","Data":"d15f244864ef59f36c91d0b1e2482bd759337c67c7ee7bdca37dddcb580fce7a"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.011451 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 
14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.037013 4651 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-d2wqw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.037076 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" podUID="b19dd389-5b85-4864-b4d7-bcf3222b1061" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.039303 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-nwdsz" podStartSLOduration=8.039277958 podStartE2EDuration="8.039277958s" podCreationTimestamp="2025-11-26 14:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:17.955915739 +0000 UTC m=+45.381663353" watchObservedRunningTime="2025-11-26 14:51:18.039277958 +0000 UTC m=+45.465025562" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.039937 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bl9fp" podStartSLOduration=20.039930965 podStartE2EDuration="20.039930965s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:18.03668529 +0000 UTC m=+45.462432894" watchObservedRunningTime="2025-11-26 14:51:18.039930965 +0000 UTC m=+45.465678569" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 
14:51:18.083176 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm" event={"ID":"d53b7ba5-49be-4aa3-87d6-89c74221cfda","Type":"ContainerStarted","Data":"9a9cb00e4ad083981f171f8f3fbd2ed111525e9d7fc97070c191d07681342735"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.108364 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:18 crc kubenswrapper[4651]: E1126 14:51:18.110504 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:18.61047716 +0000 UTC m=+46.036224854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.119807 4651 generic.go:334] "Generic (PLEG): container finished" podID="e69d02a9-477f-4281-bb15-469b21b21f7a" containerID="9a26c9b973ea3a2b06ab24751673a212ce703816d68b1f22c36259fd2cc06396" exitCode=0 Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.120601 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mgxls" event={"ID":"e69d02a9-477f-4281-bb15-469b21b21f7a","Type":"ContainerDied","Data":"9a26c9b973ea3a2b06ab24751673a212ce703816d68b1f22c36259fd2cc06396"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.149565 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" podStartSLOduration=19.149547517 podStartE2EDuration="19.149547517s" podCreationTimestamp="2025-11-26 14:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:18.147953486 +0000 UTC m=+45.573701100" watchObservedRunningTime="2025-11-26 14:51:18.149547517 +0000 UTC m=+45.575295131" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.183614 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-96wpb" event={"ID":"841bb384-9633-40c9-883b-109faa7e681d","Type":"ContainerStarted","Data":"9932514bb87150f4e44018234627444d0c64c80532a8150e189dfb37cf43bad3"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 
14:51:18.197781 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" event={"ID":"5c36becc-6886-4b68-8a62-9a857bd09359","Type":"ContainerStarted","Data":"209cdb6ad0d8492a077a7b9e313eed2498f667404fb3aceb7cd6bb560e0aae60"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.197826 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" event={"ID":"5c36becc-6886-4b68-8a62-9a857bd09359","Type":"ContainerStarted","Data":"41d42bb60f640cfab5b8215a245e24d4bb6b2d8cf8685496b8fbb0d1a4169565"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.209364 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:18 crc kubenswrapper[4651]: E1126 14:51:18.210495 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:18.710479532 +0000 UTC m=+46.136227136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.211560 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" event={"ID":"7e11728f-727f-4ed6-8c97-7d3383fb0db1","Type":"ContainerStarted","Data":"ee7d3b04b8ab49d10626b760e0787d264343db7475478bd4ddd638b41fc09b14"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.220850 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" event={"ID":"b1e02d51-3be7-4c15-9e50-f446bca05403","Type":"ContainerStarted","Data":"f3a2cd004e7307fa51f963c8bee460c22083fe214878ff9079e0d8328555b337"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.228953 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.229126 4651 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jgc22 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.229162 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" podUID="b1e02d51-3be7-4c15-9e50-f446bca05403" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.263763 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" event={"ID":"c9862b5b-24f3-41bc-aae6-36f911cf57a0","Type":"ContainerStarted","Data":"0b1810521797593c25c11ae9695bc72597a37d5b802aed92a6cf74a89f2d472d"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.263801 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" event={"ID":"c9862b5b-24f3-41bc-aae6-36f911cf57a0","Type":"ContainerStarted","Data":"4167f38cc79a67956ead40af7b8aca41106868082e3e718a30325a7a4a6980a9"} Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.277673 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.281677 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.311859 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:18 crc kubenswrapper[4651]: E1126 14:51:18.314598 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:18.814585481 +0000 UTC m=+46.240333075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.327616 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:18 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:18 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:18 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.327666 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.364236 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" podStartSLOduration=19.364214753 podStartE2EDuration="19.364214753s" podCreationTimestamp="2025-11-26 14:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:18.281417668 +0000 UTC m=+45.707165272" watchObservedRunningTime="2025-11-26 14:51:18.364214753 +0000 UTC m=+45.789962357" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.412527 4651 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:18 crc kubenswrapper[4651]: E1126 14:51:18.413985 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:18.913969717 +0000 UTC m=+46.339717321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.466926 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-skjk9" podStartSLOduration=20.466908575 podStartE2EDuration="20.466908575s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:18.374388378 +0000 UTC m=+45.800135992" watchObservedRunningTime="2025-11-26 14:51:18.466908575 +0000 UTC m=+45.892656169" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.516735 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:18 crc kubenswrapper[4651]: E1126 14:51:18.517095 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:19.01708224 +0000 UTC m=+46.442829844 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.621507 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:18 crc kubenswrapper[4651]: E1126 14:51:18.621851 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:19.121836296 +0000 UTC m=+46.547583900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.717696 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-96wpb" podStartSLOduration=8.717678079 podStartE2EDuration="8.717678079s" podCreationTimestamp="2025-11-26 14:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:18.678411578 +0000 UTC m=+46.104159182" watchObservedRunningTime="2025-11-26 14:51:18.717678079 +0000 UTC m=+46.143425673" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.725633 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:18 crc kubenswrapper[4651]: E1126 14:51:18.725980 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:19.225970125 +0000 UTC m=+46.651717729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.756140 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-96sw4" podStartSLOduration=20.75611973 podStartE2EDuration="20.75611973s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:18.754630061 +0000 UTC m=+46.180377665" watchObservedRunningTime="2025-11-26 14:51:18.75611973 +0000 UTC m=+46.181867334" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.757583 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" podStartSLOduration=19.757575997 podStartE2EDuration="19.757575997s" podCreationTimestamp="2025-11-26 14:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:18.719282131 +0000 UTC m=+46.145029735" watchObservedRunningTime="2025-11-26 14:51:18.757575997 +0000 UTC m=+46.183323601" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.800618 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" podStartSLOduration=20.800602947 podStartE2EDuration="20.800602947s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:18.800364351 +0000 UTC m=+46.226111945" watchObservedRunningTime="2025-11-26 14:51:18.800602947 +0000 UTC m=+46.226350551" Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.826580 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:18 crc kubenswrapper[4651]: E1126 14:51:18.826760 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:19.326734957 +0000 UTC m=+46.752482561 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.826874 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:18 crc kubenswrapper[4651]: E1126 14:51:18.827133 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:19.327122067 +0000 UTC m=+46.752869671 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:18 crc kubenswrapper[4651]: I1126 14:51:18.927838 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:18 crc kubenswrapper[4651]: E1126 14:51:18.928208 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:19.428192987 +0000 UTC m=+46.853940591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.029645 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:19 crc kubenswrapper[4651]: E1126 14:51:19.030199 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:19.530186771 +0000 UTC m=+46.955934365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.130565 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:19 crc kubenswrapper[4651]: E1126 14:51:19.130762 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:19.630745618 +0000 UTC m=+47.056493222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.232233 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:19 crc kubenswrapper[4651]: E1126 14:51:19.232660 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:19.732644689 +0000 UTC m=+47.158392293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.282857 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5" event={"ID":"415988cf-7b69-44c8-a978-2ba440a2196b","Type":"ContainerStarted","Data":"6d2aa6b4e7c1a7585e071732c803d798aeb67447d09f894cac8401a0acd5cfc5"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.290388 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-79fzh" event={"ID":"46f059e4-ddf4-4e21-b528-0cc9cec8afa1","Type":"ContainerStarted","Data":"90c987941f88c493f579066370ab8a7c4613897376b3bfb85250709291b1b634"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.293550 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" event={"ID":"7af18cbb-d68f-4d02-9556-62ea57ed250f","Type":"ContainerStarted","Data":"99e2f1d5b0742ac97cbc5651fc7fc910a6dc850a3d80a1dbbc247ff07cdfc493"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.294970 4651 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-rwlv8 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.295259 4651 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" podUID="7af18cbb-d68f-4d02-9556-62ea57ed250f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.298434 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" event={"ID":"1bb12b15-c889-47bf-9e26-25196edb90e0","Type":"ContainerStarted","Data":"4f0848b18d210a981120371a732ef88b2365637201679d21b9e050f1a545e3c9"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.312475 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" event={"ID":"cebd2b5f-d730-498e-a779-ad053c71f5ff","Type":"ContainerStarted","Data":"f4ce86c804e79a17e33fea89529018dbc87e8159b372ff27f01fe38ce0366b6c"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.312551 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.318308 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" event={"ID":"97f4b33d-62ab-442b-a11a-2c62f88c3b80","Type":"ContainerStarted","Data":"157b1bed242fb3074b34409ce6f70c969872a4b04aabbcb3624ac33e8269cad4"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.323238 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:19 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:19 crc 
kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:19 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.323285 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.328787 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" event={"ID":"7e11728f-727f-4ed6-8c97-7d3383fb0db1","Type":"ContainerStarted","Data":"ad36b15261e1a59a4ac4d919c41a052477a3b67114b0267ee384ec2087b04cc8"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.328831 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" event={"ID":"7e11728f-727f-4ed6-8c97-7d3383fb0db1","Type":"ContainerStarted","Data":"9fb8010432cd7c4d79e6173aa3ddbe728743432af87fad810cd7e0cbd398e6a7"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.333543 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:19 crc kubenswrapper[4651]: E1126 14:51:19.333881 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:19.833867622 +0000 UTC m=+47.259615216 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.346916 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-br4rv" event={"ID":"59ae7330-c38e-426b-8781-4115ebed8c71","Type":"ContainerStarted","Data":"a38b81a0d24cd80bd1b6a6f7ccd9617dc55a82ed6aabea8ec042f2ef8f3953c2"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.346955 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-br4rv" event={"ID":"59ae7330-c38e-426b-8781-4115ebed8c71","Type":"ContainerStarted","Data":"33c4029fbb60fab23328f8168809b26af8dbd1c6e6c6a424b2d8a5bae9590098"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.347477 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.360106 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" event={"ID":"9c5376be-3ddd-4168-aed7-8ea2bc1fc97e","Type":"ContainerStarted","Data":"da11d3af1a2cc2ccb145bb7f23b8cad3e481694c61cb62f17e398cb8962ccc5e"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.361664 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" event={"ID":"a52c4031-f252-41d9-9904-eb7e9f78d501","Type":"ContainerStarted","Data":"bb9d733483cb0c767cc8bfe74aa6daf086cdaba35d6ba0c45afe4e2e762dff87"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.372910 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-c6sbq" event={"ID":"af7dcbad-1236-40d4-9c4b-0fa57fb76df3","Type":"ContainerStarted","Data":"65c7c3ffb817f440acfbbe23ecbbabdbd7e717a01cfab1247eb46f191028b6b6"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.372974 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.374564 4651 patch_prober.go:28] interesting pod/console-operator-58897d9998-c6sbq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/readyz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.374618 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-c6sbq" podUID="af7dcbad-1236-40d4-9c4b-0fa57fb76df3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/readyz\": dial tcp 10.217.0.38:8443: connect: connection refused" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.384565 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" event={"ID":"50268c7a-5457-412a-8233-f8045815e7bf","Type":"ContainerStarted","Data":"5836734a37f4d5832cee0b85cc4d3a9e8c45b1ccecaca1be5d49d9ad6c22d8d4"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.395936 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" podStartSLOduration=20.395917797 podStartE2EDuration="20.395917797s" podCreationTimestamp="2025-11-26 14:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:19.393520494 +0000 UTC m=+46.819268108" watchObservedRunningTime="2025-11-26 14:51:19.395917797 +0000 UTC m=+46.821665401" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.396280 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s4nz5" podStartSLOduration=21.396276216 podStartE2EDuration="21.396276216s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:19.336672515 +0000 UTC m=+46.762420139" watchObservedRunningTime="2025-11-26 14:51:19.396276216 +0000 UTC m=+46.822023820" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.395021 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm" event={"ID":"d53b7ba5-49be-4aa3-87d6-89c74221cfda","Type":"ContainerStarted","Data":"b5a460b018d40cfec6f0b68ad81b1228dd9d4309ea1e90cc25aeb342ba9d360b"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.420655 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" event={"ID":"990e931d-e194-4eb0-91dc-d2d7ea0f8d4e","Type":"ContainerStarted","Data":"212cf802d69e1d6d3ab58fa4295586642b1d035064ea61488437f5b97e8efd9b"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.420702 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mgxls" event={"ID":"e69d02a9-477f-4281-bb15-469b21b21f7a","Type":"ContainerStarted","Data":"48b15cf938538681e1853eb21ffa2c264a7a7411c52d109e7465a1edbf0f6966"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.439581 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:19 crc kubenswrapper[4651]: E1126 14:51:19.441202 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:19.941190544 +0000 UTC m=+47.366938148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.441819 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" event={"ID":"695d6bbc-9f78-4920-8186-a77d167378a9","Type":"ContainerStarted","Data":"f4ce9d64d0483f0d47816da226391e41eccfcd3bf02ee5e68c6969eca37b7618"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.465344 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" event={"ID":"b1e02d51-3be7-4c15-9e50-f446bca05403","Type":"ContainerStarted","Data":"d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e"} Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.469443 4651 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jgc22 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get 
\"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.469499 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" podUID="b1e02d51-3be7-4c15-9e50-f446bca05403" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.470219 4651 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-d2wqw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.470261 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" podUID="b19dd389-5b85-4864-b4d7-bcf3222b1061" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.470570 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.481255 4651 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-4xztb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.481330 4651 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" podUID="c9862b5b-24f3-41bc-aae6-36f911cf57a0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.489840 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4wgpt" podStartSLOduration=21.48981857 podStartE2EDuration="21.48981857s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:19.458155896 +0000 UTC m=+46.883903500" watchObservedRunningTime="2025-11-26 14:51:19.48981857 +0000 UTC m=+46.915566234" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.540546 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:19 crc kubenswrapper[4651]: E1126 14:51:19.541925 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.041910205 +0000 UTC m=+47.467657809 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.567582 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.584580 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bh4kq" podStartSLOduration=21.584560996 podStartE2EDuration="21.584560996s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:19.579093423 +0000 UTC m=+47.004841047" watchObservedRunningTime="2025-11-26 14:51:19.584560996 +0000 UTC m=+47.010308600" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.643022 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:19 crc kubenswrapper[4651]: E1126 14:51:19.646711 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:20.146688382 +0000 UTC m=+47.572436096 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.721296 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zl5lt" podStartSLOduration=21.721272782 podStartE2EDuration="21.721272782s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:19.677867983 +0000 UTC m=+47.103615587" watchObservedRunningTime="2025-11-26 14:51:19.721272782 +0000 UTC m=+47.147020396" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.744085 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:19 crc kubenswrapper[4651]: E1126 14:51:19.744472 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.244458156 +0000 UTC m=+47.670205760 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.780507 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmtmr" podStartSLOduration=20.780487793 podStartE2EDuration="20.780487793s" podCreationTimestamp="2025-11-26 14:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:19.72464199 +0000 UTC m=+47.150389594" watchObservedRunningTime="2025-11-26 14:51:19.780487793 +0000 UTC m=+47.206235397" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.845731 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:19 crc kubenswrapper[4651]: E1126 14:51:19.846136 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.346120031 +0000 UTC m=+47.771867625 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.884546 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c6kjm" podStartSLOduration=21.88452609 podStartE2EDuration="21.88452609s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:19.781746076 +0000 UTC m=+47.207493680" watchObservedRunningTime="2025-11-26 14:51:19.88452609 +0000 UTC m=+47.310273694" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.886670 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-br4rv" podStartSLOduration=9.886663115 podStartE2EDuration="9.886663115s" podCreationTimestamp="2025-11-26 14:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:19.882429006 +0000 UTC m=+47.308176630" watchObservedRunningTime="2025-11-26 14:51:19.886663115 +0000 UTC m=+47.312410719" Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.947027 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 
14:51:19 crc kubenswrapper[4651]: E1126 14:51:19.947158 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.447131209 +0000 UTC m=+47.872878813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:19 crc kubenswrapper[4651]: I1126 14:51:19.947278 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:19 crc kubenswrapper[4651]: E1126 14:51:19.947606 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.447595351 +0000 UTC m=+47.873342955 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.042887 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" podStartSLOduration=22.04286603 podStartE2EDuration="22.04286603s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:19.989948513 +0000 UTC m=+47.415696117" watchObservedRunningTime="2025-11-26 14:51:20.04286603 +0000 UTC m=+47.468613634" Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.048209 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.048379 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.548352403 +0000 UTC m=+47.974100007 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.048468 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.048781 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.548774003 +0000 UTC m=+47.974521607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.150748 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.151007 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.650981353 +0000 UTC m=+48.076728957 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.195227 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-pp9mp" podStartSLOduration=21.195203884 podStartE2EDuration="21.195203884s" podCreationTimestamp="2025-11-26 14:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:20.057591493 +0000 UTC m=+47.483339097" watchObservedRunningTime="2025-11-26 14:51:20.195203884 +0000 UTC m=+47.620951498" Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.196131 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-c6sbq" podStartSLOduration=22.196122298 podStartE2EDuration="22.196122298s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:20.194398152 +0000 UTC m=+47.620145776" watchObservedRunningTime="2025-11-26 14:51:20.196122298 +0000 UTC m=+47.621869902" Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.251857 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: 
\"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.252257 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.752241758 +0000 UTC m=+48.177989362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.321282 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:20 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:20 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:20 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.321348 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.344172 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xfr4" 
podStartSLOduration=22.34415298 podStartE2EDuration="22.34415298s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:20.342295941 +0000 UTC m=+47.768043545" watchObservedRunningTime="2025-11-26 14:51:20.34415298 +0000 UTC m=+47.769900584" Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.352771 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.352949 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.852923908 +0000 UTC m=+48.278671512 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.353057 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.353478 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.853462481 +0000 UTC m=+48.279210085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.454135 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.454318 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.954293965 +0000 UTC m=+48.380041579 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.454519 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.454868 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:20.954851999 +0000 UTC m=+48.380599603 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.471130 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-79fzh" event={"ID":"46f059e4-ddf4-4e21-b528-0cc9cec8afa1","Type":"ContainerStarted","Data":"60b41cd98a0519a99d74a019d10c360d74449893120b0a24dac0e9f98a10b2b7"} Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.474228 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mgxls" event={"ID":"e69d02a9-477f-4281-bb15-469b21b21f7a","Type":"ContainerStarted","Data":"1932c953cb3459c76efcd70c1628c18681b54c4eb88f37b5c34855f8ef4265ce"} Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.480582 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" event={"ID":"2d474cd7-8d0f-40f2-b125-94c074eee3c2","Type":"ContainerStarted","Data":"1db14996f16fcceac878a33ee268c5b07379e27cd7b72ec4cd7745d453467707"} Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.481278 4651 patch_prober.go:28] interesting pod/console-operator-58897d9998-c6sbq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/readyz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.481916 4651 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jgc22 container/marketplace-operator namespace/openshift-marketplace: Readiness probe 
status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.481973 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" podUID="b1e02d51-3be7-4c15-9e50-f446bca05403" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.481320 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-c6sbq" podUID="af7dcbad-1236-40d4-9c4b-0fa57fb76df3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/readyz\": dial tcp 10.217.0.38:8443: connect: connection refused" Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.519125 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4xztb" Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.531676 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rwlv8" Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.561649 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.564652 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:21.064622575 +0000 UTC m=+48.490370249 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.579601 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-b92mn" podStartSLOduration=21.579581555 podStartE2EDuration="21.579581555s" podCreationTimestamp="2025-11-26 14:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:20.472761856 +0000 UTC m=+47.898509480" watchObservedRunningTime="2025-11-26 14:51:20.579581555 +0000 UTC m=+48.005329169" Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.664879 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.665249 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:21.165234473 +0000 UTC m=+48.590982077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.765528 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.765726 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:21.265708408 +0000 UTC m=+48.691456012 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.765912 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.766235 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:21.266227571 +0000 UTC m=+48.691975165 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.866921 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.867100 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:21.367073035 +0000 UTC m=+48.792820639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.867151 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.867495 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:21.367483586 +0000 UTC m=+48.793231190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.892771 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-79fzh" podStartSLOduration=22.892751014 podStartE2EDuration="22.892751014s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:20.581244088 +0000 UTC m=+48.006991682" watchObservedRunningTime="2025-11-26 14:51:20.892751014 +0000 UTC m=+48.318498618" Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.894799 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jzbt6"] Nov 26 14:51:20 crc kubenswrapper[4651]: I1126 14:51:20.968269 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:20 crc kubenswrapper[4651]: E1126 14:51:20.968638 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:21.468621828 +0000 UTC m=+48.894369432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.069985 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.070313 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:21.570301273 +0000 UTC m=+48.996048877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.101479 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-mgxls" podStartSLOduration=23.101456504 podStartE2EDuration="23.101456504s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:20.93334328 +0000 UTC m=+48.359090894" watchObservedRunningTime="2025-11-26 14:51:21.101456504 +0000 UTC m=+48.527204108" Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.171792 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.172119 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:21.672087321 +0000 UTC m=+49.097834925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.172274 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.172700 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:21.672692887 +0000 UTC m=+49.098440491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.273769 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.274142 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:21.774127447 +0000 UTC m=+49.199875051 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.321024 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:21 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:21 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:21 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.321449 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.375249 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.375659 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:21.875639448 +0000 UTC m=+49.301387132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.476452 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.476744 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:21.976729368 +0000 UTC m=+49.402476972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.496871 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" podUID="6defa317-08ba-4208-8537-f7ed45bc26e9" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" gracePeriod=30 Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.497257 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" event={"ID":"2d474cd7-8d0f-40f2-b125-94c074eee3c2","Type":"ContainerStarted","Data":"f3e0cc2c28bea36ea83b6bfbafa3ee18b68aefad28301366104d4f432b336433"} Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.497310 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" event={"ID":"2d474cd7-8d0f-40f2-b125-94c074eee3c2","Type":"ContainerStarted","Data":"bd38f878dbffeb142338b7af1d5429e3897277b18cb2c07638de9d275db4a03f"} Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.499630 4651 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jgc22 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.499657 4651 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" podUID="b1e02d51-3be7-4c15-9e50-f446bca05403" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.578107 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.580229 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.080212931 +0000 UTC m=+49.505960535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.679188 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.679330 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.179309809 +0000 UTC m=+49.605057413 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.679475 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.679800 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.179789321 +0000 UTC m=+49.605536925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.780388 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.780578 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.280552193 +0000 UTC m=+49.706299797 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.780691 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.780987 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.280976825 +0000 UTC m=+49.706724499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.881971 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.882390 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.382374643 +0000 UTC m=+49.808122237 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:21 crc kubenswrapper[4651]: I1126 14:51:21.983195 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:21 crc kubenswrapper[4651]: E1126 14:51:21.983555 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.483543675 +0000 UTC m=+49.909291279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.084707 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.084988 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.584961294 +0000 UTC m=+50.010708898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.085445 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.085774 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.585758304 +0000 UTC m=+50.011505908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.186934 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.187093 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.687075001 +0000 UTC m=+50.112822605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.187225 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.187555 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.687546744 +0000 UTC m=+50.113294348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.266818 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.266872 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.280267 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.287856 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.288062 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.788021998 +0000 UTC m=+50.213769602 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.288195 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.288520 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.788511831 +0000 UTC m=+50.214259435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.322088 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:22 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:22 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:22 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.322167 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.388852 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.389769 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:22.889739504 +0000 UTC m=+50.315487108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.433002 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bkcjt"] Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.433894 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.435836 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.445423 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bkcjt"] Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.490128 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dk2f\" (UniqueName: \"kubernetes.io/projected/962a3109-87ee-4cdb-9def-3676eb13e46a-kube-api-access-2dk2f\") pod \"certified-operators-bkcjt\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.490198 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.490253 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-utilities\") pod \"certified-operators-bkcjt\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.490276 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-catalog-content\") pod \"certified-operators-bkcjt\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.490606 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:22.990585858 +0000 UTC m=+50.416333602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.503176 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" event={"ID":"2d474cd7-8d0f-40f2-b125-94c074eee3c2","Type":"ContainerStarted","Data":"a43e6b4242661920a37fff92096b555e45bc499a74369b781b17fad691c59efe"} Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.509516 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j7sjm" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.545904 4651 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9zwm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.545953 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-v9zwm" podUID="9010f7b8-93e2-47e6-ab50-16ca7a9b337d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.546171 4651 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9zwm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" 
start-of-body= Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.546209 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v9zwm" podUID="9010f7b8-93e2-47e6-ab50-16ca7a9b337d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.565958 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2jn8n" podStartSLOduration=12.565940989 podStartE2EDuration="12.565940989s" podCreationTimestamp="2025-11-26 14:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:22.53141197 +0000 UTC m=+49.957159574" watchObservedRunningTime="2025-11-26 14:51:22.565940989 +0000 UTC m=+49.991688583" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.591273 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.591474 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-utilities\") pod \"certified-operators-bkcjt\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.591507 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-catalog-content\") 
pod \"certified-operators-bkcjt\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.591557 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dk2f\" (UniqueName: \"kubernetes.io/projected/962a3109-87ee-4cdb-9def-3676eb13e46a-kube-api-access-2dk2f\") pod \"certified-operators-bkcjt\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.591888 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:23.091872334 +0000 UTC m=+50.517619938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.602409 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-utilities\") pod \"certified-operators-bkcjt\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.602687 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-catalog-content\") pod \"certified-operators-bkcjt\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.636172 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dk2f\" (UniqueName: \"kubernetes.io/projected/962a3109-87ee-4cdb-9def-3676eb13e46a-kube-api-access-2dk2f\") pod \"certified-operators-bkcjt\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.639675 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vqpd2"] Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.640589 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.647067 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.654076 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.654708 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.657451 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.657586 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.673326 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.692844 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.693167 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:23.193156389 +0000 UTC m=+50.618903983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.734576 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqpd2"] Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.747286 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.802539 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.802725 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:23.302696889 +0000 UTC m=+50.728444493 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.802991 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-catalog-content\") pod \"community-operators-vqpd2\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.803048 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94lrw\" (UniqueName: \"kubernetes.io/projected/e13d4652-4c12-40d7-bb77-edb7ce43bd47-kube-api-access-94lrw\") pod \"community-operators-vqpd2\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.803079 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.803110 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.803133 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.803269 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-utilities\") pod \"community-operators-vqpd2\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.803384 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:23.303372587 +0000 UTC m=+50.729120191 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.854608 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.855293 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8822d"] Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.856094 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.856190 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.856368 4651 patch_prober.go:28] interesting pod/console-f9d7485db-q4qzb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.856411 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q4qzb" podUID="74cd140b-bb74-4152-bb6f-0a42f92c864e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.883162 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.883223 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.904694 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.904851 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94lrw\" (UniqueName: \"kubernetes.io/projected/e13d4652-4c12-40d7-bb77-edb7ce43bd47-kube-api-access-94lrw\") pod \"community-operators-vqpd2\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 
14:51:22.904880 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.904914 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.904958 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-utilities\") pod \"community-operators-vqpd2\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.904995 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-catalog-content\") pod \"community-operators-vqpd2\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.905429 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-catalog-content\") pod \"community-operators-vqpd2\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:51:22 crc kubenswrapper[4651]: E1126 14:51:22.905508 4651 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:23.405492194 +0000 UTC m=+50.831239798 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.905764 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.906118 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-utilities\") pod \"community-operators-vqpd2\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.966366 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94lrw\" (UniqueName: \"kubernetes.io/projected/e13d4652-4c12-40d7-bb77-edb7ce43bd47-kube-api-access-94lrw\") pod \"community-operators-vqpd2\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 
14:51:22.966620 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.976050 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8822d"] Nov 26 14:51:22 crc kubenswrapper[4651]: I1126 14:51:22.979954 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.008322 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.008382 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5h2z\" (UniqueName: \"kubernetes.io/projected/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-kube-api-access-q5h2z\") pod \"certified-operators-8822d\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") " pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.008406 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-catalog-content\") pod \"certified-operators-8822d\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") " 
pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.008443 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-utilities\") pod \"certified-operators-8822d\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") " pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:51:23 crc kubenswrapper[4651]: E1126 14:51:23.009982 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:23.509967412 +0000 UTC m=+50.935715106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.041017 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j4n5k"] Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.041918 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.109940 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:23 crc kubenswrapper[4651]: E1126 14:51:23.114305 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:23.614277367 +0000 UTC m=+51.040024971 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.114556 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-utilities\") pod \"certified-operators-8822d\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") " pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.114669 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-catalog-content\") pod 
\"community-operators-j4n5k\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") " pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.114701 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-utilities\") pod \"community-operators-j4n5k\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") " pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.114720 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nx55\" (UniqueName: \"kubernetes.io/projected/6182a634-5814-40aa-9ef0-419481ca7c1d-kube-api-access-6nx55\") pod \"community-operators-j4n5k\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") " pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.114809 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.114858 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5h2z\" (UniqueName: \"kubernetes.io/projected/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-kube-api-access-q5h2z\") pod \"certified-operators-8822d\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") " pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.114880 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-catalog-content\") pod \"certified-operators-8822d\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") " pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.115309 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-catalog-content\") pod \"certified-operators-8822d\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") " pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.115518 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-utilities\") pod \"certified-operators-8822d\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") " pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:51:23 crc kubenswrapper[4651]: E1126 14:51:23.115768 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:23.615758725 +0000 UTC m=+51.041506329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.124232 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j4n5k"] Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.124452 4651 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.153938 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5h2z\" (UniqueName: \"kubernetes.io/projected/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-kube-api-access-q5h2z\") pod \"certified-operators-8822d\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") " pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.173866 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.217477 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.217794 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-catalog-content\") pod \"community-operators-j4n5k\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") " pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.217830 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-utilities\") pod \"community-operators-j4n5k\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") " pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.217848 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nx55\" (UniqueName: \"kubernetes.io/projected/6182a634-5814-40aa-9ef0-419481ca7c1d-kube-api-access-6nx55\") pod \"community-operators-j4n5k\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") " pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:51:23 crc kubenswrapper[4651]: E1126 14:51:23.218364 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 14:51:23.718332404 +0000 UTC m=+51.144080008 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.218759 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-catalog-content\") pod \"community-operators-j4n5k\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") " pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.218783 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-utilities\") pod \"community-operators-j4n5k\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") " pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.273406 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.288987 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nx55\" (UniqueName: \"kubernetes.io/projected/6182a634-5814-40aa-9ef0-419481ca7c1d-kube-api-access-6nx55\") pod \"community-operators-j4n5k\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") " pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.317620 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.318710 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:23 crc kubenswrapper[4651]: E1126 14:51:23.319302 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:23.819290781 +0000 UTC m=+51.245038385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.327941 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:23 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:23 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:23 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.327995 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.387618 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.419744 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:23 crc kubenswrapper[4651]: E1126 14:51:23.420821 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:23.920806082 +0000 UTC m=+51.346553686 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.522754 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:23 crc kubenswrapper[4651]: E1126 14:51:23.532614 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-11-26 14:51:24.032600301 +0000 UTC m=+51.458347905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.634980 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:23 crc kubenswrapper[4651]: E1126 14:51:23.635797 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:24.135782876 +0000 UTC m=+51.561530480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.736302 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:23 crc kubenswrapper[4651]: E1126 14:51:23.736587 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:24.236575638 +0000 UTC m=+51.662323232 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.789553 4651 patch_prober.go:28] interesting pod/apiserver-76f77b778f-mgxls container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 26 14:51:23 crc kubenswrapper[4651]: [+]log ok Nov 26 14:51:23 crc kubenswrapper[4651]: [+]etcd ok Nov 26 14:51:23 crc kubenswrapper[4651]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 26 14:51:23 crc kubenswrapper[4651]: [+]poststarthook/generic-apiserver-start-informers ok Nov 26 14:51:23 crc kubenswrapper[4651]: [+]poststarthook/max-in-flight-filter ok Nov 26 14:51:23 crc kubenswrapper[4651]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 26 14:51:23 crc kubenswrapper[4651]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 26 14:51:23 crc kubenswrapper[4651]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 26 14:51:23 crc kubenswrapper[4651]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Nov 26 14:51:23 crc kubenswrapper[4651]: [+]poststarthook/project.openshift.io-projectcache ok Nov 26 14:51:23 crc kubenswrapper[4651]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 26 14:51:23 crc kubenswrapper[4651]: [+]poststarthook/openshift.io-startinformers ok Nov 26 14:51:23 crc kubenswrapper[4651]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 26 14:51:23 crc 
kubenswrapper[4651]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 26 14:51:23 crc kubenswrapper[4651]: livez check failed Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.789621 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-mgxls" podUID="e69d02a9-477f-4281-bb15-469b21b21f7a" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.825402 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqpd2"] Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.840556 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:23 crc kubenswrapper[4651]: E1126 14:51:23.840865 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 14:51:24.340850061 +0000 UTC m=+51.766597675 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:23 crc kubenswrapper[4651]: W1126 14:51:23.874175 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode13d4652_4c12_40d7_bb77_edb7ce43bd47.slice/crio-37f40655cd0d1e9a78fe70e617816500d55286cd76c306981778c9eb3f0bb5f9 WatchSource:0}: Error finding container 37f40655cd0d1e9a78fe70e617816500d55286cd76c306981778c9eb3f0bb5f9: Status 404 returned error can't find the container with id 37f40655cd0d1e9a78fe70e617816500d55286cd76c306981778c9eb3f0bb5f9 Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.944164 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:23 crc kubenswrapper[4651]: E1126 14:51:23.944821 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 14:51:24.444810046 +0000 UTC m=+51.870557650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bb2l7" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.951652 4651 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-26T14:51:23.124484112Z","Handler":null,"Name":""} Nov 26 14:51:23 crc kubenswrapper[4651]: I1126 14:51:23.954685 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d2wqw" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.001805 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bkcjt"] Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.011231 4651 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.011269 4651 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.045864 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.058493 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 26 14:51:24 crc kubenswrapper[4651]: E1126 14:51:24.058644 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.085428 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.099832 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j4n5k"] Nov 26 14:51:24 crc kubenswrapper[4651]: E1126 14:51:24.108841 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:24 crc kubenswrapper[4651]: E1126 14:51:24.111427 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:24 crc kubenswrapper[4651]: E1126 14:51:24.111533 4651 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" podUID="6defa317-08ba-4208-8537-f7ed45bc26e9" containerName="kube-multus-additional-cni-plugins" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.142028 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8822d"] Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.148245 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.219732 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.264772 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-c6sbq" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.322795 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:24 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:24 crc kubenswrapper[4651]: 
[+]process-running ok Nov 26 14:51:24 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.323255 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.357103 4651 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.357195 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.431731 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kkzc8"] Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.432619 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.437852 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.446371 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkzc8"] Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.477434 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.478460 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.489444 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.489739 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.510284 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.545151 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8822d" event={"ID":"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140","Type":"ContainerStarted","Data":"84285533d09d8f7e4977ef0c726e0fd19d80179f6755901a094a02a9bf25d1ae"} Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.546207 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n5k" event={"ID":"6182a634-5814-40aa-9ef0-419481ca7c1d","Type":"ContainerStarted","Data":"a10833ce7b060a83f49d4bc0b549b57d3c0c79f19147e357606b36a7761286f7"} Nov 26 
14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.547217 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkcjt" event={"ID":"962a3109-87ee-4cdb-9def-3676eb13e46a","Type":"ContainerStarted","Data":"8bba5666ef6e9d436b1cdd562e1feb144b56adc586aa1e12d3968d892e2727b0"} Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.547978 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415","Type":"ContainerStarted","Data":"f4809b00384721775789ff50d03f086fe79c5992dfb9b23f9edb44eafa5b547c"} Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.549540 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqpd2" event={"ID":"e13d4652-4c12-40d7-bb77-edb7ce43bd47","Type":"ContainerStarted","Data":"9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be"} Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.549564 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqpd2" event={"ID":"e13d4652-4c12-40d7-bb77-edb7ce43bd47","Type":"ContainerStarted","Data":"37f40655cd0d1e9a78fe70e617816500d55286cd76c306981778c9eb3f0bb5f9"} Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.564433 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-catalog-content\") pod \"redhat-marketplace-kkzc8\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.564507 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6df8db8a-1258-4f76-8131-cbba03509cb2-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"6df8db8a-1258-4f76-8131-cbba03509cb2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.564525 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84h87\" (UniqueName: \"kubernetes.io/projected/495ceaad-8c5b-477a-9630-21fdad21a5da-kube-api-access-84h87\") pod \"redhat-marketplace-kkzc8\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.564551 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-utilities\") pod \"redhat-marketplace-kkzc8\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.564589 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6df8db8a-1258-4f76-8131-cbba03509cb2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6df8db8a-1258-4f76-8131-cbba03509cb2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.578324 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bb2l7\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") " pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.605811 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.665262 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-catalog-content\") pod \"redhat-marketplace-kkzc8\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.665347 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6df8db8a-1258-4f76-8131-cbba03509cb2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6df8db8a-1258-4f76-8131-cbba03509cb2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.665369 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84h87\" (UniqueName: \"kubernetes.io/projected/495ceaad-8c5b-477a-9630-21fdad21a5da-kube-api-access-84h87\") pod \"redhat-marketplace-kkzc8\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.665406 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-utilities\") pod \"redhat-marketplace-kkzc8\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.665480 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6df8db8a-1258-4f76-8131-cbba03509cb2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6df8db8a-1258-4f76-8131-cbba03509cb2\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.666067 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6df8db8a-1258-4f76-8131-cbba03509cb2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6df8db8a-1258-4f76-8131-cbba03509cb2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.666448 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-utilities\") pod \"redhat-marketplace-kkzc8\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.666893 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-catalog-content\") pod \"redhat-marketplace-kkzc8\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.687262 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84h87\" (UniqueName: \"kubernetes.io/projected/495ceaad-8c5b-477a-9630-21fdad21a5da-kube-api-access-84h87\") pod \"redhat-marketplace-kkzc8\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.706602 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6df8db8a-1258-4f76-8131-cbba03509cb2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6df8db8a-1258-4f76-8131-cbba03509cb2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 14:51:24 crc 
kubenswrapper[4651]: I1126 14:51:24.744590 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.830183 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.836568 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j2h89"] Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.837453 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.936398 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2h89"] Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.947828 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bb2l7"] Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.971337 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-utilities\") pod \"redhat-marketplace-j2h89\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.971420 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-catalog-content\") pod \"redhat-marketplace-j2h89\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:51:24 crc kubenswrapper[4651]: I1126 14:51:24.971493 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmplg\" (UniqueName: \"kubernetes.io/projected/24d8e3c8-6e05-46be-9e58-349533485f18-kube-api-access-kmplg\") pod \"redhat-marketplace-j2h89\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:51:24 crc kubenswrapper[4651]: W1126 14:51:24.979283 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ff5e03_1863_4dad_bc3a_9c21d0521b17.slice/crio-47729edd6ceb4e91de49632deb5aefc4016e22dd0225e752ac56c24f4828dce5 WatchSource:0}: Error finding container 47729edd6ceb4e91de49632deb5aefc4016e22dd0225e752ac56c24f4828dce5: Status 404 returned error can't find the container with id 47729edd6ceb4e91de49632deb5aefc4016e22dd0225e752ac56c24f4828dce5 Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.005613 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.005787 4651 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.046690 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.074315 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-utilities\") pod \"redhat-marketplace-j2h89\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.074351 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-catalog-content\") pod \"redhat-marketplace-j2h89\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.074380 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmplg\" (UniqueName: \"kubernetes.io/projected/24d8e3c8-6e05-46be-9e58-349533485f18-kube-api-access-kmplg\") pod \"redhat-marketplace-j2h89\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.075018 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-utilities\") pod \"redhat-marketplace-j2h89\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.075323 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-catalog-content\") pod \"redhat-marketplace-j2h89\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.110941 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmplg\" (UniqueName: \"kubernetes.io/projected/24d8e3c8-6e05-46be-9e58-349533485f18-kube-api-access-kmplg\") pod \"redhat-marketplace-j2h89\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.119084 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkzc8"] Nov 26 14:51:25 crc kubenswrapper[4651]: 
I1126 14:51:25.257722 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 14:51:25 crc kubenswrapper[4651]: W1126 14:51:25.274998 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6df8db8a_1258_4f76_8131_cbba03509cb2.slice/crio-bbaf701045823e7ef04947958985785b6da8f47f4db054d0ec4d625edf36b7ac WatchSource:0}: Error finding container bbaf701045823e7ef04947958985785b6da8f47f4db054d0ec4d625edf36b7ac: Status 404 returned error can't find the container with id bbaf701045823e7ef04947958985785b6da8f47f4db054d0ec4d625edf36b7ac Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.287926 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.324955 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:25 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:25 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:25 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.325007 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.411891 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.554323 4651 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2h89"] Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.557260 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6df8db8a-1258-4f76-8131-cbba03509cb2","Type":"ContainerStarted","Data":"bbaf701045823e7ef04947958985785b6da8f47f4db054d0ec4d625edf36b7ac"} Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.559087 4651 generic.go:334] "Generic (PLEG): container finished" podID="6182a634-5814-40aa-9ef0-419481ca7c1d" containerID="6b7c654dd072f48bae822b6446c70d67679e23cf2a89edb0c6461a15cf7569fa" exitCode=0 Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.559355 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n5k" event={"ID":"6182a634-5814-40aa-9ef0-419481ca7c1d","Type":"ContainerDied","Data":"6b7c654dd072f48bae822b6446c70d67679e23cf2a89edb0c6461a15cf7569fa"} Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.561907 4651 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.573992 4651 generic.go:334] "Generic (PLEG): container finished" podID="962a3109-87ee-4cdb-9def-3676eb13e46a" containerID="1a40ed276383820d6f45ac153a1bfa5f87f96460780fb7f372c2aa183155010a" exitCode=0 Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.574189 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkcjt" event={"ID":"962a3109-87ee-4cdb-9def-3676eb13e46a","Type":"ContainerDied","Data":"1a40ed276383820d6f45ac153a1bfa5f87f96460780fb7f372c2aa183155010a"} Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.577358 4651 generic.go:334] "Generic (PLEG): container finished" podID="dd29a15c-636b-4ed9-ae3d-e1b9f2e41415" containerID="2c9a5e0405009db448ddce50d98535f6540599d1ed9f61c9ea07db8a64ca04ab" exitCode=0 Nov 26 14:51:25 
crc kubenswrapper[4651]: I1126 14:51:25.577426 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415","Type":"ContainerDied","Data":"2c9a5e0405009db448ddce50d98535f6540599d1ed9f61c9ea07db8a64ca04ab"} Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.585861 4651 generic.go:334] "Generic (PLEG): container finished" podID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" containerID="9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be" exitCode=0 Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.586211 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqpd2" event={"ID":"e13d4652-4c12-40d7-bb77-edb7ce43bd47","Type":"ContainerDied","Data":"9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be"} Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.601272 4651 generic.go:334] "Generic (PLEG): container finished" podID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" containerID="95fbfc6f5289f90d3bb106d4dce269972320ba29e999b97c3ffd7c7e768894cc" exitCode=0 Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.601358 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8822d" event={"ID":"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140","Type":"ContainerDied","Data":"95fbfc6f5289f90d3bb106d4dce269972320ba29e999b97c3ffd7c7e768894cc"} Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.612120 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" event={"ID":"a6ff5e03-1863-4dad-bc3a-9c21d0521b17","Type":"ContainerStarted","Data":"2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73"} Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.615518 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 
26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.615543 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" event={"ID":"a6ff5e03-1863-4dad-bc3a-9c21d0521b17","Type":"ContainerStarted","Data":"47729edd6ceb4e91de49632deb5aefc4016e22dd0225e752ac56c24f4828dce5"} Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.624451 4651 generic.go:334] "Generic (PLEG): container finished" podID="495ceaad-8c5b-477a-9630-21fdad21a5da" containerID="91b9227cf6ee8fa6410e4b392fd8b24288047c8137e87c94975acbca476e9439" exitCode=0 Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.625341 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkzc8" event={"ID":"495ceaad-8c5b-477a-9630-21fdad21a5da","Type":"ContainerDied","Data":"91b9227cf6ee8fa6410e4b392fd8b24288047c8137e87c94975acbca476e9439"} Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.625371 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkzc8" event={"ID":"495ceaad-8c5b-477a-9630-21fdad21a5da","Type":"ContainerStarted","Data":"3380af0360ccaf733c95b01f773be028d45106ae53eed47b953c2efe55a272ec"} Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.638748 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qglps"] Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.639812 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.650611 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.670333 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qglps"] Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.770854 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" podStartSLOduration=27.770835498 podStartE2EDuration="27.770835498s" podCreationTimestamp="2025-11-26 14:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:25.769639236 +0000 UTC m=+53.195386850" watchObservedRunningTime="2025-11-26 14:51:25.770835498 +0000 UTC m=+53.196583102" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.789533 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-catalog-content\") pod \"redhat-operators-qglps\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.790128 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-utilities\") pod \"redhat-operators-qglps\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.790241 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k4n24\" (UniqueName: \"kubernetes.io/projected/69edd84b-b59a-4094-b20b-a05bbe031a10-kube-api-access-k4n24\") pod \"redhat-operators-qglps\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.891262 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-catalog-content\") pod \"redhat-operators-qglps\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.891346 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-utilities\") pod \"redhat-operators-qglps\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.891408 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4n24\" (UniqueName: \"kubernetes.io/projected/69edd84b-b59a-4094-b20b-a05bbe031a10-kube-api-access-k4n24\") pod \"redhat-operators-qglps\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.891805 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-catalog-content\") pod \"redhat-operators-qglps\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.892105 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-utilities\") pod \"redhat-operators-qglps\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:51:25 crc kubenswrapper[4651]: I1126 14:51:25.910902 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4n24\" (UniqueName: \"kubernetes.io/projected/69edd84b-b59a-4094-b20b-a05bbe031a10-kube-api-access-k4n24\") pod \"redhat-operators-qglps\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.017216 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.030969 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4gpc7"] Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.032231 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.052749 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gpc7"] Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.096090 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.096154 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-catalog-content\") pod \"redhat-operators-4gpc7\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.096183 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.096228 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jbcz\" (UniqueName: \"kubernetes.io/projected/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-kube-api-access-8jbcz\") pod \"redhat-operators-4gpc7\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:51:26 
crc kubenswrapper[4651]: I1126 14:51:26.096265 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.096362 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-utilities\") pod \"redhat-operators-4gpc7\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.096387 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.097838 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.101700 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") 
" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.106529 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.107072 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.134675 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.162593 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.171873 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.197647 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-utilities\") pod \"redhat-operators-4gpc7\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.197723 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-catalog-content\") pod \"redhat-operators-4gpc7\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.197764 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jbcz\" (UniqueName: \"kubernetes.io/projected/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-kube-api-access-8jbcz\") pod \"redhat-operators-4gpc7\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.203696 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-utilities\") pod \"redhat-operators-4gpc7\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.213468 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-catalog-content\") pod \"redhat-operators-4gpc7\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " 
pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.223614 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jbcz\" (UniqueName: \"kubernetes.io/projected/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-kube-api-access-8jbcz\") pod \"redhat-operators-4gpc7\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.326298 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:26 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:26 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:26 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.326345 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.397567 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.534736 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qglps"] Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.641798 4651 generic.go:334] "Generic (PLEG): container finished" podID="24d8e3c8-6e05-46be-9e58-349533485f18" containerID="2af89da25844fd32f38e778184937f1736910d31caeb1204da735845dbf73fae" exitCode=0 Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.641851 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2h89" event={"ID":"24d8e3c8-6e05-46be-9e58-349533485f18","Type":"ContainerDied","Data":"2af89da25844fd32f38e778184937f1736910d31caeb1204da735845dbf73fae"} Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.641944 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2h89" event={"ID":"24d8e3c8-6e05-46be-9e58-349533485f18","Type":"ContainerStarted","Data":"8581077f77b5c91b67c5405426072e8bb1416a4c9a3dc46c158e4f158518f64a"} Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.661513 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qglps" event={"ID":"69edd84b-b59a-4094-b20b-a05bbe031a10","Type":"ContainerStarted","Data":"973d1969a650eaca1b0b3d156626b8bad02c3daf6a7fc15c8bae642c1ba8290b"} Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.765468 4651 generic.go:334] "Generic (PLEG): container finished" podID="6df8db8a-1258-4f76-8131-cbba03509cb2" containerID="e61a71c0e272aa3d139a6a65772d77878013b2be7994c12bae8d1605afa9b59e" exitCode=0 Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.765992 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"6df8db8a-1258-4f76-8131-cbba03509cb2","Type":"ContainerDied","Data":"e61a71c0e272aa3d139a6a65772d77878013b2be7994c12bae8d1605afa9b59e"} Nov 26 14:51:26 crc kubenswrapper[4651]: I1126 14:51:26.811202 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gpc7"] Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.243973 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.317480 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kube-api-access\") pod \"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415\" (UID: \"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415\") " Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.324165 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kubelet-dir\") pod \"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415\" (UID: \"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415\") " Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.324539 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dd29a15c-636b-4ed9-ae3d-e1b9f2e41415" (UID: "dd29a15c-636b-4ed9-ae3d-e1b9f2e41415"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.331290 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:27 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:27 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:27 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.331348 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.331290 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dd29a15c-636b-4ed9-ae3d-e1b9f2e41415" (UID: "dd29a15c-636b-4ed9-ae3d-e1b9f2e41415"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.426175 4651 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.426210 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd29a15c-636b-4ed9-ae3d-e1b9f2e41415-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.783228 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2b1d41e70c82691c7182fda57b735ed11a37bc4283c1b62f020d71b0403dabf1"} Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.783272 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"183ba8ea665ae361c8618f143434cc985c6eb6d7ff2f10a8c36fad452b4151a9"} Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.784021 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.797341 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"be90d6b5723e27d5d8cd4aa3c71cfed10953a09cce8a5ae1bedc0c3da4de9883"} Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.797367 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"460390458c685fe4db2b73ebf3c545f938d60acc25bf6c9a309956e1ff890640"} Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.821753 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd29a15c-636b-4ed9-ae3d-e1b9f2e41415","Type":"ContainerDied","Data":"f4809b00384721775789ff50d03f086fe79c5992dfb9b23f9edb44eafa5b547c"} Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.821823 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4809b00384721775789ff50d03f086fe79c5992dfb9b23f9edb44eafa5b547c" Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.821781 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.855452 4651 generic.go:334] "Generic (PLEG): container finished" podID="69edd84b-b59a-4094-b20b-a05bbe031a10" containerID="a873d5e7f8b3ffbb063f2d18727fb07d5cba8c5e50bd65d2d0c24d0ae25453c3" exitCode=0 Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.855560 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qglps" event={"ID":"69edd84b-b59a-4094-b20b-a05bbe031a10","Type":"ContainerDied","Data":"a873d5e7f8b3ffbb063f2d18727fb07d5cba8c5e50bd65d2d0c24d0ae25453c3"} Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.861507 4651 generic.go:334] "Generic (PLEG): container finished" podID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" containerID="a71820aea95e5c9b18c9c72994cc658fefe3ec6208ae12b91b3a14fb053f0814" exitCode=0 Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.861603 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gpc7" 
event={"ID":"253c5900-fc2c-440b-a5bc-9731ad0eb9c5","Type":"ContainerDied","Data":"a71820aea95e5c9b18c9c72994cc658fefe3ec6208ae12b91b3a14fb053f0814"} Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.861645 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gpc7" event={"ID":"253c5900-fc2c-440b-a5bc-9731ad0eb9c5","Type":"ContainerStarted","Data":"f3a5ad313dad5a3a145f2e73eeeccd33882692dc0eb01e3d79f43653a0316733"} Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.871905 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"344d4f7e5c5621ebf1e9aef475dc81c3296e891bb582bdf64d1d0f4329db3142"} Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.871958 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fc0e62993133dc90a6009ca012ce7dee6009599c3ba64b6ef1b53a95220995ae"} Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.891524 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:27 crc kubenswrapper[4651]: I1126 14:51:27.908296 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-mgxls" Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.157804 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.177500 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.324517 4651 
patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:28 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:28 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:28 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.324569 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.487228 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.536685 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.536671843 podStartE2EDuration="536.671843ms" podCreationTimestamp="2025-11-26 14:51:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:28.533548542 +0000 UTC m=+55.959296166" watchObservedRunningTime="2025-11-26 14:51:28.536671843 +0000 UTC m=+55.962419447" Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.558484 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6df8db8a-1258-4f76-8131-cbba03509cb2-kube-api-access\") pod \"6df8db8a-1258-4f76-8131-cbba03509cb2\" (UID: \"6df8db8a-1258-4f76-8131-cbba03509cb2\") " Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.558542 4651 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6df8db8a-1258-4f76-8131-cbba03509cb2-kubelet-dir\") pod \"6df8db8a-1258-4f76-8131-cbba03509cb2\" (UID: \"6df8db8a-1258-4f76-8131-cbba03509cb2\") " Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.558826 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6df8db8a-1258-4f76-8131-cbba03509cb2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6df8db8a-1258-4f76-8131-cbba03509cb2" (UID: "6df8db8a-1258-4f76-8131-cbba03509cb2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.571312 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df8db8a-1258-4f76-8131-cbba03509cb2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6df8db8a-1258-4f76-8131-cbba03509cb2" (UID: "6df8db8a-1258-4f76-8131-cbba03509cb2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.661831 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6df8db8a-1258-4f76-8131-cbba03509cb2-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.661861 4651 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6df8db8a-1258-4f76-8131-cbba03509cb2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.959373 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6df8db8a-1258-4f76-8131-cbba03509cb2","Type":"ContainerDied","Data":"bbaf701045823e7ef04947958985785b6da8f47f4db054d0ec4d625edf36b7ac"} Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.959406 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbaf701045823e7ef04947958985785b6da8f47f4db054d0ec4d625edf36b7ac" Nov 26 14:51:28 crc kubenswrapper[4651]: I1126 14:51:28.959446 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 14:51:29 crc kubenswrapper[4651]: I1126 14:51:29.022448 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-br4rv" Nov 26 14:51:29 crc kubenswrapper[4651]: I1126 14:51:29.321683 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:29 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:29 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:29 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:29 crc kubenswrapper[4651]: I1126 14:51:29.321999 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:30 crc kubenswrapper[4651]: I1126 14:51:30.321988 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:30 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:30 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:30 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:30 crc kubenswrapper[4651]: I1126 14:51:30.322099 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:31 crc kubenswrapper[4651]: I1126 14:51:31.039307 4651 generic.go:334] 
"Generic (PLEG): container finished" podID="695d6bbc-9f78-4920-8186-a77d167378a9" containerID="f4ce9d64d0483f0d47816da226391e41eccfcd3bf02ee5e68c6969eca37b7618" exitCode=0 Nov 26 14:51:31 crc kubenswrapper[4651]: I1126 14:51:31.039380 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" event={"ID":"695d6bbc-9f78-4920-8186-a77d167378a9","Type":"ContainerDied","Data":"f4ce9d64d0483f0d47816da226391e41eccfcd3bf02ee5e68c6969eca37b7618"} Nov 26 14:51:31 crc kubenswrapper[4651]: I1126 14:51:31.320980 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:31 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:31 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:31 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:31 crc kubenswrapper[4651]: I1126 14:51:31.321027 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.322897 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:32 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:32 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:32 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.323166 4651 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.368483 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.427600 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/695d6bbc-9f78-4920-8186-a77d167378a9-secret-volume\") pod \"695d6bbc-9f78-4920-8186-a77d167378a9\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.427752 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d7qh\" (UniqueName: \"kubernetes.io/projected/695d6bbc-9f78-4920-8186-a77d167378a9-kube-api-access-7d7qh\") pod \"695d6bbc-9f78-4920-8186-a77d167378a9\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.427885 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/695d6bbc-9f78-4920-8186-a77d167378a9-config-volume\") pod \"695d6bbc-9f78-4920-8186-a77d167378a9\" (UID: \"695d6bbc-9f78-4920-8186-a77d167378a9\") " Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.429847 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/695d6bbc-9f78-4920-8186-a77d167378a9-config-volume" (OuterVolumeSpecName: "config-volume") pod "695d6bbc-9f78-4920-8186-a77d167378a9" (UID: "695d6bbc-9f78-4920-8186-a77d167378a9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.436494 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/695d6bbc-9f78-4920-8186-a77d167378a9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "695d6bbc-9f78-4920-8186-a77d167378a9" (UID: "695d6bbc-9f78-4920-8186-a77d167378a9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.436891 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/695d6bbc-9f78-4920-8186-a77d167378a9-kube-api-access-7d7qh" (OuterVolumeSpecName: "kube-api-access-7d7qh") pod "695d6bbc-9f78-4920-8186-a77d167378a9" (UID: "695d6bbc-9f78-4920-8186-a77d167378a9"). InnerVolumeSpecName "kube-api-access-7d7qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.529227 4651 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/695d6bbc-9f78-4920-8186-a77d167378a9-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.529271 4651 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/695d6bbc-9f78-4920-8186-a77d167378a9-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.529286 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d7qh\" (UniqueName: \"kubernetes.io/projected/695d6bbc-9f78-4920-8186-a77d167378a9-kube-api-access-7d7qh\") on node \"crc\" DevicePath \"\"" Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.546448 4651 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9zwm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.546580 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-v9zwm" podUID="9010f7b8-93e2-47e6-ab50-16ca7a9b337d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.547392 4651 patch_prober.go:28] interesting pod/downloads-7954f5f757-v9zwm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.547448 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v9zwm" podUID="9010f7b8-93e2-47e6-ab50-16ca7a9b337d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.855675 4651 patch_prober.go:28] interesting pod/console-f9d7485db-q4qzb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Nov 26 14:51:32 crc kubenswrapper[4651]: I1126 14:51:32.855770 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q4qzb" podUID="74cd140b-bb74-4152-bb6f-0a42f92c864e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Nov 26 14:51:33 crc kubenswrapper[4651]: I1126 14:51:33.079257 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" event={"ID":"695d6bbc-9f78-4920-8186-a77d167378a9","Type":"ContainerDied","Data":"e6cf5568af6d9b462adf45850066a5d0872043c5cf3f1c7b59ec19c333de0d57"} Nov 26 14:51:33 crc kubenswrapper[4651]: I1126 14:51:33.079324 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6cf5568af6d9b462adf45850066a5d0872043c5cf3f1c7b59ec19c333de0d57" Nov 26 14:51:33 crc kubenswrapper[4651]: I1126 14:51:33.079431 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402805-z6w5m" Nov 26 14:51:33 crc kubenswrapper[4651]: I1126 14:51:33.332516 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:33 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:33 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:33 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:33 crc kubenswrapper[4651]: I1126 14:51:33.332585 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:34 crc kubenswrapper[4651]: E1126 14:51:34.037007 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:34 crc kubenswrapper[4651]: E1126 14:51:34.053023 4651 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:34 crc kubenswrapper[4651]: E1126 14:51:34.064234 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:34 crc kubenswrapper[4651]: E1126 14:51:34.064306 4651 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" podUID="6defa317-08ba-4208-8537-f7ed45bc26e9" containerName="kube-multus-additional-cni-plugins" Nov 26 14:51:34 crc kubenswrapper[4651]: I1126 14:51:34.320751 4651 patch_prober.go:28] interesting pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:34 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:34 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:34 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:34 crc kubenswrapper[4651]: I1126 14:51:34.321183 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:35 crc kubenswrapper[4651]: I1126 14:51:35.320914 4651 patch_prober.go:28] interesting 
pod/router-default-5444994796-vw9bz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 14:51:35 crc kubenswrapper[4651]: [-]has-synced failed: reason withheld Nov 26 14:51:35 crc kubenswrapper[4651]: [+]process-running ok Nov 26 14:51:35 crc kubenswrapper[4651]: healthz check failed Nov 26 14:51:35 crc kubenswrapper[4651]: I1126 14:51:35.321105 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vw9bz" podUID="7856b53a-287e-4c39-9f3f-0f384ecc84fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 14:51:36 crc kubenswrapper[4651]: I1126 14:51:36.321012 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:36 crc kubenswrapper[4651]: I1126 14:51:36.325422 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-vw9bz" Nov 26 14:51:42 crc kubenswrapper[4651]: I1126 14:51:42.554515 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-v9zwm" Nov 26 14:51:42 crc kubenswrapper[4651]: I1126 14:51:42.987134 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:42 crc kubenswrapper[4651]: I1126 14:51:42.991078 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-q4qzb" Nov 26 14:51:44 crc kubenswrapper[4651]: E1126 14:51:44.036771 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:44 crc kubenswrapper[4651]: E1126 14:51:44.039979 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:44 crc kubenswrapper[4651]: E1126 14:51:44.044138 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:44 crc kubenswrapper[4651]: E1126 14:51:44.044523 4651 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" podUID="6defa317-08ba-4208-8537-f7ed45bc26e9" containerName="kube-multus-additional-cni-plugins" Nov 26 14:51:44 crc kubenswrapper[4651]: I1126 14:51:44.613229 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" Nov 26 14:51:49 crc kubenswrapper[4651]: I1126 14:51:49.417508 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 26 14:51:52 crc kubenswrapper[4651]: I1126 14:51:52.281394 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-jzbt6_6defa317-08ba-4208-8537-f7ed45bc26e9/kube-multus-additional-cni-plugins/0.log" Nov 26 14:51:52 crc kubenswrapper[4651]: I1126 
14:51:52.281446 4651 generic.go:334] "Generic (PLEG): container finished" podID="6defa317-08ba-4208-8537-f7ed45bc26e9" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" exitCode=137 Nov 26 14:51:52 crc kubenswrapper[4651]: I1126 14:51:52.281485 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" event={"ID":"6defa317-08ba-4208-8537-f7ed45bc26e9","Type":"ContainerDied","Data":"dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb"} Nov 26 14:51:53 crc kubenswrapper[4651]: I1126 14:51:53.422656 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.422637622 podStartE2EDuration="4.422637622s" podCreationTimestamp="2025-11-26 14:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:51:53.419464998 +0000 UTC m=+80.845212612" watchObservedRunningTime="2025-11-26 14:51:53.422637622 +0000 UTC m=+80.848385226" Nov 26 14:51:53 crc kubenswrapper[4651]: I1126 14:51:53.935319 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7zf4d" Nov 26 14:51:54 crc kubenswrapper[4651]: E1126 14:51:54.032785 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb is running failed: container process not found" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:54 crc kubenswrapper[4651]: E1126 14:51:54.033429 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking 
if PID of dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb is running failed: container process not found" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:54 crc kubenswrapper[4651]: E1126 14:51:54.033900 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb is running failed: container process not found" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:51:54 crc kubenswrapper[4651]: E1126 14:51:54.033950 4651 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" podUID="6defa317-08ba-4208-8537-f7ed45bc26e9" containerName="kube-multus-additional-cni-plugins" Nov 26 14:52:04 crc kubenswrapper[4651]: E1126 14:52:04.032395 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb is running failed: container process not found" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:52:04 crc kubenswrapper[4651]: E1126 14:52:04.033299 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb is running failed: container process not found" 
containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:52:04 crc kubenswrapper[4651]: E1126 14:52:04.033566 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb is running failed: container process not found" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 26 14:52:04 crc kubenswrapper[4651]: E1126 14:52:04.033825 4651 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" podUID="6defa317-08ba-4208-8537-f7ed45bc26e9" containerName="kube-multus-additional-cni-plugins" Nov 26 14:52:04 crc kubenswrapper[4651]: E1126 14:52:04.056061 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 26 14:52:04 crc kubenswrapper[4651]: E1126 14:52:04.056270 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84h87,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kkzc8_openshift-marketplace(495ceaad-8c5b-477a-9630-21fdad21a5da): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 14:52:04 crc kubenswrapper[4651]: E1126 14:52:04.058247 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kkzc8" podUID="495ceaad-8c5b-477a-9630-21fdad21a5da" Nov 26 14:52:06 crc 
kubenswrapper[4651]: I1126 14:52:06.165782 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 14:52:06 crc kubenswrapper[4651]: E1126 14:52:06.988304 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kkzc8" podUID="495ceaad-8c5b-477a-9630-21fdad21a5da" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.035235 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-jzbt6_6defa317-08ba-4208-8537-f7ed45bc26e9/kube-multus-additional-cni-plugins/0.log" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.035301 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.101207 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6defa317-08ba-4208-8537-f7ed45bc26e9-cni-sysctl-allowlist\") pod \"6defa317-08ba-4208-8537-f7ed45bc26e9\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.101259 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6defa317-08ba-4208-8537-f7ed45bc26e9-tuning-conf-dir\") pod \"6defa317-08ba-4208-8537-f7ed45bc26e9\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.101306 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6defa317-08ba-4208-8537-f7ed45bc26e9-ready\") pod 
\"6defa317-08ba-4208-8537-f7ed45bc26e9\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.101367 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fswxq\" (UniqueName: \"kubernetes.io/projected/6defa317-08ba-4208-8537-f7ed45bc26e9-kube-api-access-fswxq\") pod \"6defa317-08ba-4208-8537-f7ed45bc26e9\" (UID: \"6defa317-08ba-4208-8537-f7ed45bc26e9\") " Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.102089 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6defa317-08ba-4208-8537-f7ed45bc26e9-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "6defa317-08ba-4208-8537-f7ed45bc26e9" (UID: "6defa317-08ba-4208-8537-f7ed45bc26e9"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.102226 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6defa317-08ba-4208-8537-f7ed45bc26e9-ready" (OuterVolumeSpecName: "ready") pod "6defa317-08ba-4208-8537-f7ed45bc26e9" (UID: "6defa317-08ba-4208-8537-f7ed45bc26e9"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.102465 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6defa317-08ba-4208-8537-f7ed45bc26e9-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "6defa317-08ba-4208-8537-f7ed45bc26e9" (UID: "6defa317-08ba-4208-8537-f7ed45bc26e9"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.107277 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6defa317-08ba-4208-8537-f7ed45bc26e9-kube-api-access-fswxq" (OuterVolumeSpecName: "kube-api-access-fswxq") pod "6defa317-08ba-4208-8537-f7ed45bc26e9" (UID: "6defa317-08ba-4208-8537-f7ed45bc26e9"). InnerVolumeSpecName "kube-api-access-fswxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.109052 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.109176 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8jbcz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-4gpc7_openshift-marketplace(253c5900-fc2c-440b-a5bc-9731ad0eb9c5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.110338 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-4gpc7" podUID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" Nov 26 14:52:07 crc 
kubenswrapper[4651]: E1126 14:52:07.137997 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.138469 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kmplg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-j2h89_openshift-marketplace(24d8e3c8-6e05-46be-9e58-349533485f18): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.139616 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-j2h89" podUID="24d8e3c8-6e05-46be-9e58-349533485f18" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.156898 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.157057 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dk2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bkcjt_openshift-marketplace(962a3109-87ee-4cdb-9def-3676eb13e46a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.158241 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bkcjt" podUID="962a3109-87ee-4cdb-9def-3676eb13e46a" Nov 26 14:52:07 crc 
kubenswrapper[4651]: E1126 14:52:07.165923 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.166047 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k4n24,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-qglps_openshift-marketplace(69edd84b-b59a-4094-b20b-a05bbe031a10): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.167176 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qglps" podUID="69edd84b-b59a-4094-b20b-a05bbe031a10" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.202633 4651 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6defa317-08ba-4208-8537-f7ed45bc26e9-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.202668 4651 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6defa317-08ba-4208-8537-f7ed45bc26e9-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.202709 4651 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6defa317-08ba-4208-8537-f7ed45bc26e9-ready\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.202723 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fswxq\" (UniqueName: \"kubernetes.io/projected/6defa317-08ba-4208-8537-f7ed45bc26e9-kube-api-access-fswxq\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.368387 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-jzbt6_6defa317-08ba-4208-8537-f7ed45bc26e9/kube-multus-additional-cni-plugins/0.log" Nov 26 14:52:07 crc 
kubenswrapper[4651]: I1126 14:52:07.368794 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.368836 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-jzbt6" event={"ID":"6defa317-08ba-4208-8537-f7ed45bc26e9","Type":"ContainerDied","Data":"19c8310978a5bd447e255691ef673ad4dc83ddc56f3862f3294589b44fe39c09"} Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.368946 4651 scope.go:117] "RemoveContainer" containerID="dba6480ff5eef8eb99028b4994f5a35d10f361becf82c53db7b0df924f16c5fb" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.372640 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qglps" podUID="69edd84b-b59a-4094-b20b-a05bbe031a10" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.372678 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bkcjt" podUID="962a3109-87ee-4cdb-9def-3676eb13e46a" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.372773 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-4gpc7" podUID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" Nov 26 14:52:07 crc kubenswrapper[4651]: E1126 14:52:07.374432 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-j2h89" podUID="24d8e3c8-6e05-46be-9e58-349533485f18" Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.431105 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jzbt6"] Nov 26 14:52:07 crc kubenswrapper[4651]: I1126 14:52:07.435330 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jzbt6"] Nov 26 14:52:08 crc kubenswrapper[4651]: I1126 14:52:08.375856 4651 generic.go:334] "Generic (PLEG): container finished" podID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" containerID="bfa198cffb8c9d785cacf68319f18ab09ec09b17871324f3f4ff1c1fe0667966" exitCode=0 Nov 26 14:52:08 crc kubenswrapper[4651]: I1126 14:52:08.376078 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqpd2" event={"ID":"e13d4652-4c12-40d7-bb77-edb7ce43bd47","Type":"ContainerDied","Data":"bfa198cffb8c9d785cacf68319f18ab09ec09b17871324f3f4ff1c1fe0667966"} Nov 26 14:52:08 crc kubenswrapper[4651]: I1126 14:52:08.383754 4651 generic.go:334] "Generic (PLEG): container finished" podID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" containerID="016615384ae5ceca448e3080cf24621a7ebd6941b33e4e9cc787492d80dc3f2e" exitCode=0 Nov 26 14:52:08 crc kubenswrapper[4651]: I1126 14:52:08.383981 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8822d" event={"ID":"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140","Type":"ContainerDied","Data":"016615384ae5ceca448e3080cf24621a7ebd6941b33e4e9cc787492d80dc3f2e"} Nov 26 14:52:08 crc kubenswrapper[4651]: I1126 14:52:08.388785 4651 generic.go:334] "Generic (PLEG): container finished" podID="6182a634-5814-40aa-9ef0-419481ca7c1d" containerID="60320f12e5aab7ecae7e16cec6c43b40cac60bf1b7551f1c05c98af8275ae0ca" exitCode=0 Nov 26 
14:52:08 crc kubenswrapper[4651]: I1126 14:52:08.388813 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n5k" event={"ID":"6182a634-5814-40aa-9ef0-419481ca7c1d","Type":"ContainerDied","Data":"60320f12e5aab7ecae7e16cec6c43b40cac60bf1b7551f1c05c98af8275ae0ca"} Nov 26 14:52:09 crc kubenswrapper[4651]: I1126 14:52:09.396916 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n5k" event={"ID":"6182a634-5814-40aa-9ef0-419481ca7c1d","Type":"ContainerStarted","Data":"39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37"} Nov 26 14:52:09 crc kubenswrapper[4651]: I1126 14:52:09.400901 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqpd2" event={"ID":"e13d4652-4c12-40d7-bb77-edb7ce43bd47","Type":"ContainerStarted","Data":"10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719"} Nov 26 14:52:09 crc kubenswrapper[4651]: I1126 14:52:09.408612 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6defa317-08ba-4208-8537-f7ed45bc26e9" path="/var/lib/kubelet/pods/6defa317-08ba-4208-8537-f7ed45bc26e9/volumes" Nov 26 14:52:09 crc kubenswrapper[4651]: I1126 14:52:09.409284 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8822d" event={"ID":"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140","Type":"ContainerStarted","Data":"a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b"} Nov 26 14:52:09 crc kubenswrapper[4651]: I1126 14:52:09.436961 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vqpd2" podStartSLOduration=4.093854722 podStartE2EDuration="47.436938104s" podCreationTimestamp="2025-11-26 14:51:22 +0000 UTC" firstStartedPulling="2025-11-26 14:51:25.589651313 +0000 UTC m=+53.015398927" lastFinishedPulling="2025-11-26 14:52:08.932734705 +0000 UTC 
m=+96.358482309" observedRunningTime="2025-11-26 14:52:09.436125403 +0000 UTC m=+96.861873027" watchObservedRunningTime="2025-11-26 14:52:09.436938104 +0000 UTC m=+96.862685708" Nov 26 14:52:09 crc kubenswrapper[4651]: I1126 14:52:09.437541 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j4n5k" podStartSLOduration=3.09374778 podStartE2EDuration="46.43753408s" podCreationTimestamp="2025-11-26 14:51:23 +0000 UTC" firstStartedPulling="2025-11-26 14:51:25.561650995 +0000 UTC m=+52.987398599" lastFinishedPulling="2025-11-26 14:52:08.905437285 +0000 UTC m=+96.331184899" observedRunningTime="2025-11-26 14:52:09.421427711 +0000 UTC m=+96.847175335" watchObservedRunningTime="2025-11-26 14:52:09.43753408 +0000 UTC m=+96.863281684" Nov 26 14:52:09 crc kubenswrapper[4651]: I1126 14:52:09.456894 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8822d" podStartSLOduration=4.238531536 podStartE2EDuration="47.456875803s" podCreationTimestamp="2025-11-26 14:51:22 +0000 UTC" firstStartedPulling="2025-11-26 14:51:25.604371066 +0000 UTC m=+53.030118670" lastFinishedPulling="2025-11-26 14:52:08.822715343 +0000 UTC m=+96.248462937" observedRunningTime="2025-11-26 14:52:09.452678574 +0000 UTC m=+96.878426198" watchObservedRunningTime="2025-11-26 14:52:09.456875803 +0000 UTC m=+96.882623407" Nov 26 14:52:09 crc kubenswrapper[4651]: I1126 14:52:09.799544 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9h5h8"] Nov 26 14:52:12 crc kubenswrapper[4651]: I1126 14:52:12.967528 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:52:12 crc kubenswrapper[4651]: I1126 14:52:12.968766 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vqpd2" Nov 26 
14:52:13 crc kubenswrapper[4651]: I1126 14:52:13.174743 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:52:13 crc kubenswrapper[4651]: I1126 14:52:13.174781 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:52:13 crc kubenswrapper[4651]: I1126 14:52:13.186164 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:52:13 crc kubenswrapper[4651]: I1126 14:52:13.215402 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:52:13 crc kubenswrapper[4651]: I1126 14:52:13.387999 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:52:13 crc kubenswrapper[4651]: I1126 14:52:13.389112 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:52:13 crc kubenswrapper[4651]: I1126 14:52:13.441146 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:52:13 crc kubenswrapper[4651]: I1126 14:52:13.483007 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8822d" Nov 26 14:52:14 crc kubenswrapper[4651]: I1126 14:52:14.464976 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:52:14 crc kubenswrapper[4651]: I1126 14:52:14.469795 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j4n5k" Nov 26 14:52:15 crc kubenswrapper[4651]: I1126 14:52:15.605673 4651 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-j4n5k"]
Nov 26 14:52:15 crc kubenswrapper[4651]: I1126 14:52:15.805003 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8822d"]
Nov 26 14:52:15 crc kubenswrapper[4651]: I1126 14:52:15.805316 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8822d" podUID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" containerName="registry-server" containerID="cri-o://a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b" gracePeriod=2
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.163796 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8822d"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.233230 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5h2z\" (UniqueName: \"kubernetes.io/projected/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-kube-api-access-q5h2z\") pod \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") "
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.233300 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-utilities\") pod \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") "
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.233360 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-catalog-content\") pod \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\" (UID: \"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140\") "
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.234547 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-utilities" (OuterVolumeSpecName: "utilities") pod "c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" (UID: "c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.238340 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-kube-api-access-q5h2z" (OuterVolumeSpecName: "kube-api-access-q5h2z") pod "c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" (UID: "c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140"). InnerVolumeSpecName "kube-api-access-q5h2z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.284490 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" (UID: "c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.335083 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5h2z\" (UniqueName: \"kubernetes.io/projected/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-kube-api-access-q5h2z\") on node \"crc\" DevicePath \"\""
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.335124 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-utilities\") on node \"crc\" DevicePath \"\""
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.335137 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.438425 4651 generic.go:334] "Generic (PLEG): container finished" podID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" containerID="a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b" exitCode=0
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.438458 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8822d"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.438481 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8822d" event={"ID":"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140","Type":"ContainerDied","Data":"a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b"}
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.438524 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8822d" event={"ID":"c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140","Type":"ContainerDied","Data":"84285533d09d8f7e4977ef0c726e0fd19d80179f6755901a094a02a9bf25d1ae"}
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.438540 4651 scope.go:117] "RemoveContainer" containerID="a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.438850 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j4n5k" podUID="6182a634-5814-40aa-9ef0-419481ca7c1d" containerName="registry-server" containerID="cri-o://39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37" gracePeriod=2
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.454904 4651 scope.go:117] "RemoveContainer" containerID="016615384ae5ceca448e3080cf24621a7ebd6941b33e4e9cc787492d80dc3f2e"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.466573 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8822d"]
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.473790 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8822d"]
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.511636 4651 scope.go:117] "RemoveContainer" containerID="95fbfc6f5289f90d3bb106d4dce269972320ba29e999b97c3ffd7c7e768894cc"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.572542 4651 scope.go:117] "RemoveContainer" containerID="a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b"
Nov 26 14:52:16 crc kubenswrapper[4651]: E1126 14:52:16.573063 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b\": container with ID starting with a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b not found: ID does not exist" containerID="a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.573115 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b"} err="failed to get container status \"a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b\": rpc error: code = NotFound desc = could not find container \"a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b\": container with ID starting with a201c132828f7d4f33bfa732593f1b3dbd87d0bd6ea10efbdcf6b447f1c3c58b not found: ID does not exist"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.573171 4651 scope.go:117] "RemoveContainer" containerID="016615384ae5ceca448e3080cf24621a7ebd6941b33e4e9cc787492d80dc3f2e"
Nov 26 14:52:16 crc kubenswrapper[4651]: E1126 14:52:16.573525 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016615384ae5ceca448e3080cf24621a7ebd6941b33e4e9cc787492d80dc3f2e\": container with ID starting with 016615384ae5ceca448e3080cf24621a7ebd6941b33e4e9cc787492d80dc3f2e not found: ID does not exist" containerID="016615384ae5ceca448e3080cf24621a7ebd6941b33e4e9cc787492d80dc3f2e"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.573566 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016615384ae5ceca448e3080cf24621a7ebd6941b33e4e9cc787492d80dc3f2e"} err="failed to get container status \"016615384ae5ceca448e3080cf24621a7ebd6941b33e4e9cc787492d80dc3f2e\": rpc error: code = NotFound desc = could not find container \"016615384ae5ceca448e3080cf24621a7ebd6941b33e4e9cc787492d80dc3f2e\": container with ID starting with 016615384ae5ceca448e3080cf24621a7ebd6941b33e4e9cc787492d80dc3f2e not found: ID does not exist"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.573594 4651 scope.go:117] "RemoveContainer" containerID="95fbfc6f5289f90d3bb106d4dce269972320ba29e999b97c3ffd7c7e768894cc"
Nov 26 14:52:16 crc kubenswrapper[4651]: E1126 14:52:16.573843 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95fbfc6f5289f90d3bb106d4dce269972320ba29e999b97c3ffd7c7e768894cc\": container with ID starting with 95fbfc6f5289f90d3bb106d4dce269972320ba29e999b97c3ffd7c7e768894cc not found: ID does not exist" containerID="95fbfc6f5289f90d3bb106d4dce269972320ba29e999b97c3ffd7c7e768894cc"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.573875 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95fbfc6f5289f90d3bb106d4dce269972320ba29e999b97c3ffd7c7e768894cc"} err="failed to get container status \"95fbfc6f5289f90d3bb106d4dce269972320ba29e999b97c3ffd7c7e768894cc\": rpc error: code = NotFound desc = could not find container \"95fbfc6f5289f90d3bb106d4dce269972320ba29e999b97c3ffd7c7e768894cc\": container with ID starting with 95fbfc6f5289f90d3bb106d4dce269972320ba29e999b97c3ffd7c7e768894cc not found: ID does not exist"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.843098 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j4n5k"
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.941937 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nx55\" (UniqueName: \"kubernetes.io/projected/6182a634-5814-40aa-9ef0-419481ca7c1d-kube-api-access-6nx55\") pod \"6182a634-5814-40aa-9ef0-419481ca7c1d\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") "
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.942070 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-utilities\") pod \"6182a634-5814-40aa-9ef0-419481ca7c1d\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") "
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.942120 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-catalog-content\") pod \"6182a634-5814-40aa-9ef0-419481ca7c1d\" (UID: \"6182a634-5814-40aa-9ef0-419481ca7c1d\") "
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.942755 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-utilities" (OuterVolumeSpecName: "utilities") pod "6182a634-5814-40aa-9ef0-419481ca7c1d" (UID: "6182a634-5814-40aa-9ef0-419481ca7c1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 14:52:16 crc kubenswrapper[4651]: I1126 14:52:16.946175 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6182a634-5814-40aa-9ef0-419481ca7c1d-kube-api-access-6nx55" (OuterVolumeSpecName: "kube-api-access-6nx55") pod "6182a634-5814-40aa-9ef0-419481ca7c1d" (UID: "6182a634-5814-40aa-9ef0-419481ca7c1d"). InnerVolumeSpecName "kube-api-access-6nx55". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.008831 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6182a634-5814-40aa-9ef0-419481ca7c1d" (UID: "6182a634-5814-40aa-9ef0-419481ca7c1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.043422 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-utilities\") on node \"crc\" DevicePath \"\""
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.043452 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6182a634-5814-40aa-9ef0-419481ca7c1d-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.043465 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nx55\" (UniqueName: \"kubernetes.io/projected/6182a634-5814-40aa-9ef0-419481ca7c1d-kube-api-access-6nx55\") on node \"crc\" DevicePath \"\""
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.412737 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" path="/var/lib/kubelet/pods/c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140/volumes"
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.445053 4651 generic.go:334] "Generic (PLEG): container finished" podID="6182a634-5814-40aa-9ef0-419481ca7c1d" containerID="39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37" exitCode=0
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.445125 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n5k" event={"ID":"6182a634-5814-40aa-9ef0-419481ca7c1d","Type":"ContainerDied","Data":"39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37"}
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.445156 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j4n5k" event={"ID":"6182a634-5814-40aa-9ef0-419481ca7c1d","Type":"ContainerDied","Data":"a10833ce7b060a83f49d4bc0b549b57d3c0c79f19147e357606b36a7761286f7"}
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.445176 4651 scope.go:117] "RemoveContainer" containerID="39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37"
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.445271 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j4n5k"
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.471085 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j4n5k"]
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.474858 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j4n5k"]
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.474871 4651 scope.go:117] "RemoveContainer" containerID="60320f12e5aab7ecae7e16cec6c43b40cac60bf1b7551f1c05c98af8275ae0ca"
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.493476 4651 scope.go:117] "RemoveContainer" containerID="6b7c654dd072f48bae822b6446c70d67679e23cf2a89edb0c6461a15cf7569fa"
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.507934 4651 scope.go:117] "RemoveContainer" containerID="39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37"
Nov 26 14:52:17 crc kubenswrapper[4651]: E1126 14:52:17.508428 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37\": container with ID starting with 39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37 not found: ID does not exist" containerID="39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37"
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.508467 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37"} err="failed to get container status \"39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37\": rpc error: code = NotFound desc = could not find container \"39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37\": container with ID starting with 39c8d3d34d0bf8bdafe6ad6c398ff7e96c681701097b549b9a982a80acf80b37 not found: ID does not exist"
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.508491 4651 scope.go:117] "RemoveContainer" containerID="60320f12e5aab7ecae7e16cec6c43b40cac60bf1b7551f1c05c98af8275ae0ca"
Nov 26 14:52:17 crc kubenswrapper[4651]: E1126 14:52:17.508779 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60320f12e5aab7ecae7e16cec6c43b40cac60bf1b7551f1c05c98af8275ae0ca\": container with ID starting with 60320f12e5aab7ecae7e16cec6c43b40cac60bf1b7551f1c05c98af8275ae0ca not found: ID does not exist" containerID="60320f12e5aab7ecae7e16cec6c43b40cac60bf1b7551f1c05c98af8275ae0ca"
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.508878 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60320f12e5aab7ecae7e16cec6c43b40cac60bf1b7551f1c05c98af8275ae0ca"} err="failed to get container status \"60320f12e5aab7ecae7e16cec6c43b40cac60bf1b7551f1c05c98af8275ae0ca\": rpc error: code = NotFound desc = could not find container \"60320f12e5aab7ecae7e16cec6c43b40cac60bf1b7551f1c05c98af8275ae0ca\": container with ID starting with 60320f12e5aab7ecae7e16cec6c43b40cac60bf1b7551f1c05c98af8275ae0ca not found: ID does not exist"
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.508970 4651 scope.go:117] "RemoveContainer" containerID="6b7c654dd072f48bae822b6446c70d67679e23cf2a89edb0c6461a15cf7569fa"
Nov 26 14:52:17 crc kubenswrapper[4651]: E1126 14:52:17.509734 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b7c654dd072f48bae822b6446c70d67679e23cf2a89edb0c6461a15cf7569fa\": container with ID starting with 6b7c654dd072f48bae822b6446c70d67679e23cf2a89edb0c6461a15cf7569fa not found: ID does not exist" containerID="6b7c654dd072f48bae822b6446c70d67679e23cf2a89edb0c6461a15cf7569fa"
Nov 26 14:52:17 crc kubenswrapper[4651]: I1126 14:52:17.509783 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b7c654dd072f48bae822b6446c70d67679e23cf2a89edb0c6461a15cf7569fa"} err="failed to get container status \"6b7c654dd072f48bae822b6446c70d67679e23cf2a89edb0c6461a15cf7569fa\": rpc error: code = NotFound desc = could not find container \"6b7c654dd072f48bae822b6446c70d67679e23cf2a89edb0c6461a15cf7569fa\": container with ID starting with 6b7c654dd072f48bae822b6446c70d67679e23cf2a89edb0c6461a15cf7569fa not found: ID does not exist"
Nov 26 14:52:18 crc kubenswrapper[4651]: I1126 14:52:18.456080 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkzc8" event={"ID":"495ceaad-8c5b-477a-9630-21fdad21a5da","Type":"ContainerStarted","Data":"db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1"}
Nov 26 14:52:19 crc kubenswrapper[4651]: I1126 14:52:19.407492 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6182a634-5814-40aa-9ef0-419481ca7c1d" path="/var/lib/kubelet/pods/6182a634-5814-40aa-9ef0-419481ca7c1d/volumes"
Nov 26 14:52:19 crc kubenswrapper[4651]: I1126 14:52:19.461563 4651 generic.go:334] "Generic (PLEG): container finished" podID="495ceaad-8c5b-477a-9630-21fdad21a5da" containerID="db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1" exitCode=0
Nov 26 14:52:19 crc kubenswrapper[4651]: I1126 14:52:19.461597 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkzc8" event={"ID":"495ceaad-8c5b-477a-9630-21fdad21a5da","Type":"ContainerDied","Data":"db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1"}
Nov 26 14:52:20 crc kubenswrapper[4651]: I1126 14:52:20.461455 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Nov 26 14:52:20 crc kubenswrapper[4651]: I1126 14:52:20.469434 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkzc8" event={"ID":"495ceaad-8c5b-477a-9630-21fdad21a5da","Type":"ContainerStarted","Data":"54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e"}
Nov 26 14:52:20 crc kubenswrapper[4651]: I1126 14:52:20.541494 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kkzc8" podStartSLOduration=2.15406698 podStartE2EDuration="56.54147782s" podCreationTimestamp="2025-11-26 14:51:24 +0000 UTC" firstStartedPulling="2025-11-26 14:51:25.635570289 +0000 UTC m=+53.061317893" lastFinishedPulling="2025-11-26 14:52:20.022981129 +0000 UTC m=+107.448728733" observedRunningTime="2025-11-26 14:52:20.53877384 +0000 UTC m=+107.964521444" watchObservedRunningTime="2025-11-26 14:52:20.54147782 +0000 UTC m=+107.967225414"
Nov 26 14:52:21 crc kubenswrapper[4651]: I1126 14:52:21.420929 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.420912522 podStartE2EDuration="1.420912522s" podCreationTimestamp="2025-11-26 14:52:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:52:20.582407515 +0000 UTC m=+108.008155139" watchObservedRunningTime="2025-11-26 14:52:21.420912522 +0000 UTC m=+108.846660126"
Nov 26 14:52:21 crc kubenswrapper[4651]: I1126 14:52:21.475435 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkcjt" event={"ID":"962a3109-87ee-4cdb-9def-3676eb13e46a","Type":"ContainerStarted","Data":"18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28"}
Nov 26 14:52:21 crc kubenswrapper[4651]: I1126 14:52:21.479593 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2h89" event={"ID":"24d8e3c8-6e05-46be-9e58-349533485f18","Type":"ContainerStarted","Data":"33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4"}
Nov 26 14:52:22 crc kubenswrapper[4651]: I1126 14:52:22.484957 4651 generic.go:334] "Generic (PLEG): container finished" podID="24d8e3c8-6e05-46be-9e58-349533485f18" containerID="33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4" exitCode=0
Nov 26 14:52:22 crc kubenswrapper[4651]: I1126 14:52:22.485130 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2h89" event={"ID":"24d8e3c8-6e05-46be-9e58-349533485f18","Type":"ContainerDied","Data":"33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4"}
Nov 26 14:52:22 crc kubenswrapper[4651]: I1126 14:52:22.489676 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qglps" event={"ID":"69edd84b-b59a-4094-b20b-a05bbe031a10","Type":"ContainerStarted","Data":"6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf"}
Nov 26 14:52:22 crc kubenswrapper[4651]: I1126 14:52:22.493868 4651 generic.go:334] "Generic (PLEG): container finished" podID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" containerID="b746ab94cce76ca27a10a59017ee6c5f61909e647ca4d771064d4578e45f878f" exitCode=0
Nov 26 14:52:22 crc kubenswrapper[4651]: I1126 14:52:22.493956 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gpc7" event={"ID":"253c5900-fc2c-440b-a5bc-9731ad0eb9c5","Type":"ContainerDied","Data":"b746ab94cce76ca27a10a59017ee6c5f61909e647ca4d771064d4578e45f878f"}
Nov 26 14:52:22 crc kubenswrapper[4651]: I1126 14:52:22.496731 4651 generic.go:334] "Generic (PLEG): container finished" podID="962a3109-87ee-4cdb-9def-3676eb13e46a" containerID="18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28" exitCode=0
Nov 26 14:52:22 crc kubenswrapper[4651]: I1126 14:52:22.496772 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkcjt" event={"ID":"962a3109-87ee-4cdb-9def-3676eb13e46a","Type":"ContainerDied","Data":"18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28"}
Nov 26 14:52:23 crc kubenswrapper[4651]: I1126 14:52:23.504204 4651 generic.go:334] "Generic (PLEG): container finished" podID="69edd84b-b59a-4094-b20b-a05bbe031a10" containerID="6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf" exitCode=0
Nov 26 14:52:23 crc kubenswrapper[4651]: I1126 14:52:23.504247 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qglps" event={"ID":"69edd84b-b59a-4094-b20b-a05bbe031a10","Type":"ContainerDied","Data":"6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf"}
Nov 26 14:52:24 crc kubenswrapper[4651]: I1126 14:52:24.745334 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kkzc8"
Nov 26 14:52:24 crc kubenswrapper[4651]: I1126 14:52:24.745379 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kkzc8"
Nov 26 14:52:24 crc kubenswrapper[4651]: I1126 14:52:24.789697 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kkzc8"
Nov 26 14:52:25 crc kubenswrapper[4651]: I1126 14:52:25.527503 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2h89" event={"ID":"24d8e3c8-6e05-46be-9e58-349533485f18","Type":"ContainerStarted","Data":"e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb"}
Nov 26 14:52:25 crc kubenswrapper[4651]: I1126 14:52:25.555114 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j2h89" podStartSLOduration=3.864854514 podStartE2EDuration="1m1.55502982s" podCreationTimestamp="2025-11-26 14:51:24 +0000 UTC" firstStartedPulling="2025-11-26 14:51:26.670554988 +0000 UTC m=+54.096302592" lastFinishedPulling="2025-11-26 14:52:24.360730294 +0000 UTC m=+111.786477898" observedRunningTime="2025-11-26 14:52:25.550696977 +0000 UTC m=+112.976444581" watchObservedRunningTime="2025-11-26 14:52:25.55502982 +0000 UTC m=+112.980777434"
Nov 26 14:52:25 crc kubenswrapper[4651]: I1126 14:52:25.569931 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kkzc8"
Nov 26 14:52:27 crc kubenswrapper[4651]: I1126 14:52:27.544610 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkcjt" event={"ID":"962a3109-87ee-4cdb-9def-3676eb13e46a","Type":"ContainerStarted","Data":"6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676"}
Nov 26 14:52:27 crc kubenswrapper[4651]: I1126 14:52:27.567908 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bkcjt" podStartSLOduration=5.001996972 podStartE2EDuration="1m5.567891103s" podCreationTimestamp="2025-11-26 14:51:22 +0000 UTC" firstStartedPulling="2025-11-26 14:51:25.576193123 +0000 UTC m=+53.001940727" lastFinishedPulling="2025-11-26 14:52:26.142087254 +0000 UTC m=+113.567834858" observedRunningTime="2025-11-26 14:52:27.567578295 +0000 UTC m=+114.993325899" watchObservedRunningTime="2025-11-26 14:52:27.567891103 +0000 UTC m=+114.993638707"
Nov 26 14:52:29 crc kubenswrapper[4651]: I1126 14:52:29.558083 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gpc7" event={"ID":"253c5900-fc2c-440b-a5bc-9731ad0eb9c5","Type":"ContainerStarted","Data":"9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554"}
Nov 26 14:52:29 crc kubenswrapper[4651]: I1126 14:52:29.579325 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4gpc7" podStartSLOduration=3.250014624 podStartE2EDuration="1m3.579305519s" podCreationTimestamp="2025-11-26 14:51:26 +0000 UTC" firstStartedPulling="2025-11-26 14:51:27.865699085 +0000 UTC m=+55.291446689" lastFinishedPulling="2025-11-26 14:52:28.19498998 +0000 UTC m=+115.620737584" observedRunningTime="2025-11-26 14:52:29.574505924 +0000 UTC m=+117.000253548" watchObservedRunningTime="2025-11-26 14:52:29.579305519 +0000 UTC m=+117.005053123"
Nov 26 14:52:30 crc kubenswrapper[4651]: I1126 14:52:30.565387 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qglps" event={"ID":"69edd84b-b59a-4094-b20b-a05bbe031a10","Type":"ContainerStarted","Data":"2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3"}
Nov 26 14:52:31 crc kubenswrapper[4651]: I1126 14:52:31.596971 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qglps" podStartSLOduration=4.877565023 podStartE2EDuration="1m6.596953227s" podCreationTimestamp="2025-11-26 14:51:25 +0000 UTC" firstStartedPulling="2025-11-26 14:51:27.85972354 +0000 UTC m=+55.285471144" lastFinishedPulling="2025-11-26 14:52:29.579111744 +0000 UTC m=+117.004859348" observedRunningTime="2025-11-26 14:52:31.596149856 +0000 UTC m=+119.021897470" watchObservedRunningTime="2025-11-26 14:52:31.596953227 +0000 UTC m=+119.022700831"
Nov 26 14:52:32 crc kubenswrapper[4651]: I1126 14:52:32.748331 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bkcjt"
Nov 26 14:52:32 crc kubenswrapper[4651]: I1126 14:52:32.750294 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bkcjt"
Nov 26 14:52:32 crc kubenswrapper[4651]: I1126 14:52:32.796556 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bkcjt"
Nov 26 14:52:33 crc kubenswrapper[4651]: I1126 14:52:33.618269 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bkcjt"
Nov 26 14:52:34 crc kubenswrapper[4651]: I1126 14:52:34.828929 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" podUID="1b1058c7-8ca9-41f7-b961-0b48e973c6c6" containerName="oauth-openshift" containerID="cri-o://c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2" gracePeriod=15
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.177975 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.223676 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-84944f9bdc-k2tlm"]
Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.223913 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" containerName="extract-utilities"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.223933 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" containerName="extract-utilities"
Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.223947 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" containerName="extract-content"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.223953 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" containerName="extract-content"
Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.223963 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df8db8a-1258-4f76-8131-cbba03509cb2" containerName="pruner"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.223969 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df8db8a-1258-4f76-8131-cbba03509cb2" containerName="pruner"
Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.223980 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" containerName="registry-server"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.223987 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" containerName="registry-server"
Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.223996 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6182a634-5814-40aa-9ef0-419481ca7c1d" containerName="extract-utilities"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224002 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="6182a634-5814-40aa-9ef0-419481ca7c1d" containerName="extract-utilities"
Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.224010 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6defa317-08ba-4208-8537-f7ed45bc26e9" containerName="kube-multus-additional-cni-plugins"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224016 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="6defa317-08ba-4208-8537-f7ed45bc26e9" containerName="kube-multus-additional-cni-plugins"
Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.224026 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="695d6bbc-9f78-4920-8186-a77d167378a9" containerName="collect-profiles"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224047 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="695d6bbc-9f78-4920-8186-a77d167378a9" containerName="collect-profiles"
Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.224056 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6182a634-5814-40aa-9ef0-419481ca7c1d" containerName="extract-content"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224062 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="6182a634-5814-40aa-9ef0-419481ca7c1d" containerName="extract-content"
Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.224069 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6182a634-5814-40aa-9ef0-419481ca7c1d" containerName="registry-server"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224075 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="6182a634-5814-40aa-9ef0-419481ca7c1d" containerName="registry-server"
Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.224083 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd29a15c-636b-4ed9-ae3d-e1b9f2e41415" containerName="pruner"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224088 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd29a15c-636b-4ed9-ae3d-e1b9f2e41415" containerName="pruner"
Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.224094 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1058c7-8ca9-41f7-b961-0b48e973c6c6" containerName="oauth-openshift"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224102 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1058c7-8ca9-41f7-b961-0b48e973c6c6" containerName="oauth-openshift"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224194 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b9f6b3-ffe1-470d-8fb8-94bce5a3d140" containerName="registry-server"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224202 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="695d6bbc-9f78-4920-8186-a77d167378a9" containerName="collect-profiles"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224211 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd29a15c-636b-4ed9-ae3d-e1b9f2e41415" containerName="pruner"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224219 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="6182a634-5814-40aa-9ef0-419481ca7c1d" containerName="registry-server"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224227 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b1058c7-8ca9-41f7-b961-0b48e973c6c6" containerName="oauth-openshift"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224238 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df8db8a-1258-4f76-8131-cbba03509cb2" containerName="pruner"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224246 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="6defa317-08ba-4208-8537-f7ed45bc26e9" containerName="kube-multus-additional-cni-plugins"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.224618 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.228104 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-84944f9bdc-k2tlm"]
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.289285 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j2h89"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.289332 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j2h89"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.326616 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j2h89"
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348321 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-serving-cert\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") "
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348374 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-router-certs\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") "
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348430 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-trusted-ca-bundle\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348449 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-session\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348467 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-dir\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348494 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-ocp-branding-template\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348520 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-cliconfig\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348544 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-provider-selection\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348822 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-idp-0-file-data\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348861 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-service-ca\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348892 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-error\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348916 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-login\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348936 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfd8c\" (UniqueName: 
\"kubernetes.io/projected/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-kube-api-access-sfd8c\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.348969 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-policies\") pod \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\" (UID: \"1b1058c7-8ca9-41f7-b961-0b48e973c6c6\") " Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349073 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349104 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhxgm\" (UniqueName: \"kubernetes.io/projected/800bcde2-0851-4433-8dd5-b46f8f40ceb7-kube-api-access-mhxgm\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349128 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-session\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349152 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-router-certs\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349178 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-template-login\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349202 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-audit-policies\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349228 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-service-ca\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349281 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-template-error\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349618 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349647 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349672 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349698 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/800bcde2-0851-4433-8dd5-b46f8f40ceb7-audit-dir\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: 
\"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349718 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.349759 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.350761 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.350870 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.351172 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.351231 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.351538 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.354180 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.354731 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.355010 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.355386 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.355516 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.356373 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.361329 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.362463 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.368662 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-kube-api-access-sfd8c" (OuterVolumeSpecName: "kube-api-access-sfd8c") pod "1b1058c7-8ca9-41f7-b961-0b48e973c6c6" (UID: "1b1058c7-8ca9-41f7-b961-0b48e973c6c6"). InnerVolumeSpecName "kube-api-access-sfd8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.450346 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/800bcde2-0851-4433-8dd5-b46f8f40ceb7-audit-dir\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.450385 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.450416 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/800bcde2-0851-4433-8dd5-b46f8f40ceb7-audit-dir\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.451262 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.451419 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.451816 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.451849 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhxgm\" (UniqueName: \"kubernetes.io/projected/800bcde2-0851-4433-8dd5-b46f8f40ceb7-kube-api-access-mhxgm\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.451869 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-session\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.451888 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-router-certs\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" 
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.451923 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-template-login\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.451946 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-audit-policies\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.451968 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-service-ca\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.451992 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-template-error\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452017 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452076 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452098 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452226 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452238 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452249 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452258 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfd8c\" (UniqueName: \"kubernetes.io/projected/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-kube-api-access-sfd8c\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452268 4651 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452277 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452286 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452298 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452307 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452316 4651 reconciler_common.go:293] "Volume detached for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452324 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452333 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452341 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452349 4651 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1b1058c7-8ca9-41f7-b961-0b48e973c6c6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452559 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-service-ca\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.452949 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-audit-policies\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.453222 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.456051 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-template-error\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.456347 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.456469 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-template-login\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" 
Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.456488 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.456695 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.457098 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.457121 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-session\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.457959 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/800bcde2-0851-4433-8dd5-b46f8f40ceb7-v4-0-config-system-router-certs\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.466468 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhxgm\" (UniqueName: \"kubernetes.io/projected/800bcde2-0851-4433-8dd5-b46f8f40ceb7-kube-api-access-mhxgm\") pod \"oauth-openshift-84944f9bdc-k2tlm\" (UID: \"800bcde2-0851-4433-8dd5-b46f8f40ceb7\") " pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.542832 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.593207 4651 generic.go:334] "Generic (PLEG): container finished" podID="1b1058c7-8ca9-41f7-b961-0b48e973c6c6" containerID="c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2" exitCode=0 Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.593416 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" event={"ID":"1b1058c7-8ca9-41f7-b961-0b48e973c6c6","Type":"ContainerDied","Data":"c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2"} Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.593522 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.593544 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9h5h8" event={"ID":"1b1058c7-8ca9-41f7-b961-0b48e973c6c6","Type":"ContainerDied","Data":"3e36e53bfe1f4916cd7c6b423c0d27dd0a4c9190a2a6150a9866ab43964b638d"} Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.593575 4651 scope.go:117] "RemoveContainer" containerID="c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.629802 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9h5h8"] Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.631884 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9h5h8"] Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.639119 4651 scope.go:117] "RemoveContainer" containerID="c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2" Nov 26 14:52:35 crc kubenswrapper[4651]: E1126 14:52:35.639621 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2\": container with ID starting with c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2 not found: ID does not exist" containerID="c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.639663 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2"} err="failed to get container status \"c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2\": rpc error: code = NotFound desc = could not find 
container \"c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2\": container with ID starting with c6c9c4116984e74541cab10e1c835eac38e207c93ac365e094dbae92aae81ec2 not found: ID does not exist" Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.651020 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:52:35 crc kubenswrapper[4651]: W1126 14:52:35.726694 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800bcde2_0851_4433_8dd5_b46f8f40ceb7.slice/crio-6671a6d418058c589dbd1e6046e9f12f6f12a5843091f307d59cd0685aff7c60 WatchSource:0}: Error finding container 6671a6d418058c589dbd1e6046e9f12f6f12a5843091f307d59cd0685aff7c60: Status 404 returned error can't find the container with id 6671a6d418058c589dbd1e6046e9f12f6f12a5843091f307d59cd0685aff7c60 Nov 26 14:52:35 crc kubenswrapper[4651]: I1126 14:52:35.727814 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-84944f9bdc-k2tlm"] Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.018019 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.018352 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.066753 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.399143 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.399231 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.452632 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.601323 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" event={"ID":"800bcde2-0851-4433-8dd5-b46f8f40ceb7","Type":"ContainerStarted","Data":"51e5ed8540ae1a25f83820d7d3d0ca25f8ddb04d28b36603dc01b23267723e04"} Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.601385 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" event={"ID":"800bcde2-0851-4433-8dd5-b46f8f40ceb7","Type":"ContainerStarted","Data":"6671a6d418058c589dbd1e6046e9f12f6f12a5843091f307d59cd0685aff7c60"} Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.601809 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.634728 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" podStartSLOduration=27.634700786 podStartE2EDuration="27.634700786s" podCreationTimestamp="2025-11-26 14:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:52:36.6344584 +0000 UTC m=+124.060206004" watchObservedRunningTime="2025-11-26 14:52:36.634700786 +0000 UTC m=+124.060448390" Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.646499 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.650928 4651 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:52:36 crc kubenswrapper[4651]: I1126 14:52:36.837011 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-84944f9bdc-k2tlm" Nov 26 14:52:37 crc kubenswrapper[4651]: I1126 14:52:37.411835 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b1058c7-8ca9-41f7-b961-0b48e973c6c6" path="/var/lib/kubelet/pods/1b1058c7-8ca9-41f7-b961-0b48e973c6c6/volumes" Nov 26 14:52:38 crc kubenswrapper[4651]: I1126 14:52:38.806182 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2h89"] Nov 26 14:52:38 crc kubenswrapper[4651]: I1126 14:52:38.806450 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j2h89" podUID="24d8e3c8-6e05-46be-9e58-349533485f18" containerName="registry-server" containerID="cri-o://e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb" gracePeriod=2 Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.177774 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.297419 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-utilities\") pod \"24d8e3c8-6e05-46be-9e58-349533485f18\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.297534 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmplg\" (UniqueName: \"kubernetes.io/projected/24d8e3c8-6e05-46be-9e58-349533485f18-kube-api-access-kmplg\") pod \"24d8e3c8-6e05-46be-9e58-349533485f18\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.297611 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-catalog-content\") pod \"24d8e3c8-6e05-46be-9e58-349533485f18\" (UID: \"24d8e3c8-6e05-46be-9e58-349533485f18\") " Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.298350 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-utilities" (OuterVolumeSpecName: "utilities") pod "24d8e3c8-6e05-46be-9e58-349533485f18" (UID: "24d8e3c8-6e05-46be-9e58-349533485f18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.302472 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24d8e3c8-6e05-46be-9e58-349533485f18-kube-api-access-kmplg" (OuterVolumeSpecName: "kube-api-access-kmplg") pod "24d8e3c8-6e05-46be-9e58-349533485f18" (UID: "24d8e3c8-6e05-46be-9e58-349533485f18"). InnerVolumeSpecName "kube-api-access-kmplg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.315109 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24d8e3c8-6e05-46be-9e58-349533485f18" (UID: "24d8e3c8-6e05-46be-9e58-349533485f18"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.398797 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.398844 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24d8e3c8-6e05-46be-9e58-349533485f18-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.398858 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmplg\" (UniqueName: \"kubernetes.io/projected/24d8e3c8-6e05-46be-9e58-349533485f18-kube-api-access-kmplg\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.408413 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4gpc7"] Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.620508 4651 generic.go:334] "Generic (PLEG): container finished" podID="24d8e3c8-6e05-46be-9e58-349533485f18" containerID="e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb" exitCode=0 Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.620586 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2h89" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.620606 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2h89" event={"ID":"24d8e3c8-6e05-46be-9e58-349533485f18","Type":"ContainerDied","Data":"e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb"} Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.620672 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2h89" event={"ID":"24d8e3c8-6e05-46be-9e58-349533485f18","Type":"ContainerDied","Data":"8581077f77b5c91b67c5405426072e8bb1416a4c9a3dc46c158e4f158518f64a"} Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.620710 4651 scope.go:117] "RemoveContainer" containerID="e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.621017 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4gpc7" podUID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" containerName="registry-server" containerID="cri-o://9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554" gracePeriod=2 Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.636537 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2h89"] Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.640354 4651 scope.go:117] "RemoveContainer" containerID="33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.641289 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2h89"] Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.661447 4651 scope.go:117] "RemoveContainer" containerID="2af89da25844fd32f38e778184937f1736910d31caeb1204da735845dbf73fae" Nov 26 14:52:39 crc 
kubenswrapper[4651]: I1126 14:52:39.678338 4651 scope.go:117] "RemoveContainer" containerID="e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb" Nov 26 14:52:39 crc kubenswrapper[4651]: E1126 14:52:39.678903 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb\": container with ID starting with e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb not found: ID does not exist" containerID="e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.678958 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb"} err="failed to get container status \"e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb\": rpc error: code = NotFound desc = could not find container \"e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb\": container with ID starting with e809e9e93ca74c2d83b03bfcf8903f1687dbd3f2293c5f173d95c9f4a6452adb not found: ID does not exist" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.679000 4651 scope.go:117] "RemoveContainer" containerID="33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4" Nov 26 14:52:39 crc kubenswrapper[4651]: E1126 14:52:39.679501 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4\": container with ID starting with 33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4 not found: ID does not exist" containerID="33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.679536 4651 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4"} err="failed to get container status \"33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4\": rpc error: code = NotFound desc = could not find container \"33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4\": container with ID starting with 33ce06e7a01c73705051deb9a4c3ab9174a4a81f2a262ac6f682b285290a8aa4 not found: ID does not exist" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.679556 4651 scope.go:117] "RemoveContainer" containerID="2af89da25844fd32f38e778184937f1736910d31caeb1204da735845dbf73fae" Nov 26 14:52:39 crc kubenswrapper[4651]: E1126 14:52:39.679937 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2af89da25844fd32f38e778184937f1736910d31caeb1204da735845dbf73fae\": container with ID starting with 2af89da25844fd32f38e778184937f1736910d31caeb1204da735845dbf73fae not found: ID does not exist" containerID="2af89da25844fd32f38e778184937f1736910d31caeb1204da735845dbf73fae" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.679967 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af89da25844fd32f38e778184937f1736910d31caeb1204da735845dbf73fae"} err="failed to get container status \"2af89da25844fd32f38e778184937f1736910d31caeb1204da735845dbf73fae\": rpc error: code = NotFound desc = could not find container \"2af89da25844fd32f38e778184937f1736910d31caeb1204da735845dbf73fae\": container with ID starting with 2af89da25844fd32f38e778184937f1736910d31caeb1204da735845dbf73fae not found: ID does not exist" Nov 26 14:52:39 crc kubenswrapper[4651]: I1126 14:52:39.902655 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.005466 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-utilities\") pod \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.005609 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-catalog-content\") pod \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.006405 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-utilities" (OuterVolumeSpecName: "utilities") pod "253c5900-fc2c-440b-a5bc-9731ad0eb9c5" (UID: "253c5900-fc2c-440b-a5bc-9731ad0eb9c5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.007235 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jbcz\" (UniqueName: \"kubernetes.io/projected/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-kube-api-access-8jbcz\") pod \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\" (UID: \"253c5900-fc2c-440b-a5bc-9731ad0eb9c5\") " Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.007613 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.015258 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-kube-api-access-8jbcz" (OuterVolumeSpecName: "kube-api-access-8jbcz") pod "253c5900-fc2c-440b-a5bc-9731ad0eb9c5" (UID: "253c5900-fc2c-440b-a5bc-9731ad0eb9c5"). InnerVolumeSpecName "kube-api-access-8jbcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.091778 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "253c5900-fc2c-440b-a5bc-9731ad0eb9c5" (UID: "253c5900-fc2c-440b-a5bc-9731ad0eb9c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.108650 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.108722 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jbcz\" (UniqueName: \"kubernetes.io/projected/253c5900-fc2c-440b-a5bc-9731ad0eb9c5-kube-api-access-8jbcz\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.628639 4651 generic.go:334] "Generic (PLEG): container finished" podID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" containerID="9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554" exitCode=0 Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.628823 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gpc7" event={"ID":"253c5900-fc2c-440b-a5bc-9731ad0eb9c5","Type":"ContainerDied","Data":"9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554"} Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.629410 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gpc7" event={"ID":"253c5900-fc2c-440b-a5bc-9731ad0eb9c5","Type":"ContainerDied","Data":"f3a5ad313dad5a3a145f2e73eeeccd33882692dc0eb01e3d79f43653a0316733"} Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.629450 4651 scope.go:117] "RemoveContainer" containerID="9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.628894 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4gpc7" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.646291 4651 scope.go:117] "RemoveContainer" containerID="b746ab94cce76ca27a10a59017ee6c5f61909e647ca4d771064d4578e45f878f" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.658275 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4gpc7"] Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.661017 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4gpc7"] Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.671254 4651 scope.go:117] "RemoveContainer" containerID="a71820aea95e5c9b18c9c72994cc658fefe3ec6208ae12b91b3a14fb053f0814" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.684246 4651 scope.go:117] "RemoveContainer" containerID="9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554" Nov 26 14:52:40 crc kubenswrapper[4651]: E1126 14:52:40.684760 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554\": container with ID starting with 9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554 not found: ID does not exist" containerID="9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.684795 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554"} err="failed to get container status \"9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554\": rpc error: code = NotFound desc = could not find container \"9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554\": container with ID starting with 9e5704f140585823042fba0dd4132a32120f594d1d5c865f87c798403e582554 not found: ID does 
not exist" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.684821 4651 scope.go:117] "RemoveContainer" containerID="b746ab94cce76ca27a10a59017ee6c5f61909e647ca4d771064d4578e45f878f" Nov 26 14:52:40 crc kubenswrapper[4651]: E1126 14:52:40.685416 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b746ab94cce76ca27a10a59017ee6c5f61909e647ca4d771064d4578e45f878f\": container with ID starting with b746ab94cce76ca27a10a59017ee6c5f61909e647ca4d771064d4578e45f878f not found: ID does not exist" containerID="b746ab94cce76ca27a10a59017ee6c5f61909e647ca4d771064d4578e45f878f" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.685451 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b746ab94cce76ca27a10a59017ee6c5f61909e647ca4d771064d4578e45f878f"} err="failed to get container status \"b746ab94cce76ca27a10a59017ee6c5f61909e647ca4d771064d4578e45f878f\": rpc error: code = NotFound desc = could not find container \"b746ab94cce76ca27a10a59017ee6c5f61909e647ca4d771064d4578e45f878f\": container with ID starting with b746ab94cce76ca27a10a59017ee6c5f61909e647ca4d771064d4578e45f878f not found: ID does not exist" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.685469 4651 scope.go:117] "RemoveContainer" containerID="a71820aea95e5c9b18c9c72994cc658fefe3ec6208ae12b91b3a14fb053f0814" Nov 26 14:52:40 crc kubenswrapper[4651]: E1126 14:52:40.685742 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71820aea95e5c9b18c9c72994cc658fefe3ec6208ae12b91b3a14fb053f0814\": container with ID starting with a71820aea95e5c9b18c9c72994cc658fefe3ec6208ae12b91b3a14fb053f0814 not found: ID does not exist" containerID="a71820aea95e5c9b18c9c72994cc658fefe3ec6208ae12b91b3a14fb053f0814" Nov 26 14:52:40 crc kubenswrapper[4651]: I1126 14:52:40.685773 4651 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71820aea95e5c9b18c9c72994cc658fefe3ec6208ae12b91b3a14fb053f0814"} err="failed to get container status \"a71820aea95e5c9b18c9c72994cc658fefe3ec6208ae12b91b3a14fb053f0814\": rpc error: code = NotFound desc = could not find container \"a71820aea95e5c9b18c9c72994cc658fefe3ec6208ae12b91b3a14fb053f0814\": container with ID starting with a71820aea95e5c9b18c9c72994cc658fefe3ec6208ae12b91b3a14fb053f0814 not found: ID does not exist" Nov 26 14:52:41 crc kubenswrapper[4651]: I1126 14:52:41.408483 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24d8e3c8-6e05-46be-9e58-349533485f18" path="/var/lib/kubelet/pods/24d8e3c8-6e05-46be-9e58-349533485f18/volumes" Nov 26 14:52:41 crc kubenswrapper[4651]: I1126 14:52:41.409272 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" path="/var/lib/kubelet/pods/253c5900-fc2c-440b-a5bc-9731ad0eb9c5/volumes" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.672258 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bkcjt"] Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.673071 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bkcjt" podUID="962a3109-87ee-4cdb-9def-3676eb13e46a" containerName="registry-server" containerID="cri-o://6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676" gracePeriod=30 Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.686765 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqpd2"] Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.687077 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vqpd2" podUID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" containerName="registry-server" 
containerID="cri-o://10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719" gracePeriod=30 Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.696226 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jgc22"] Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.696664 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" podUID="b1e02d51-3be7-4c15-9e50-f446bca05403" containerName="marketplace-operator" containerID="cri-o://d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e" gracePeriod=30 Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.711316 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkzc8"] Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.711555 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kkzc8" podUID="495ceaad-8c5b-477a-9630-21fdad21a5da" containerName="registry-server" containerID="cri-o://54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e" gracePeriod=30 Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.741462 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qglps"] Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.741847 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qglps" podUID="69edd84b-b59a-4094-b20b-a05bbe031a10" containerName="registry-server" containerID="cri-o://2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3" gracePeriod=30 Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.750492 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhtwk"] Nov 26 14:52:50 crc kubenswrapper[4651]: E1126 14:52:50.751354 
4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d8e3c8-6e05-46be-9e58-349533485f18" containerName="extract-content" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.751376 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d8e3c8-6e05-46be-9e58-349533485f18" containerName="extract-content" Nov 26 14:52:50 crc kubenswrapper[4651]: E1126 14:52:50.751401 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d8e3c8-6e05-46be-9e58-349533485f18" containerName="registry-server" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.751408 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d8e3c8-6e05-46be-9e58-349533485f18" containerName="registry-server" Nov 26 14:52:50 crc kubenswrapper[4651]: E1126 14:52:50.751417 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" containerName="extract-content" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.751424 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" containerName="extract-content" Nov 26 14:52:50 crc kubenswrapper[4651]: E1126 14:52:50.751431 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" containerName="registry-server" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.751437 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" containerName="registry-server" Nov 26 14:52:50 crc kubenswrapper[4651]: E1126 14:52:50.751449 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" containerName="extract-utilities" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.751457 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" containerName="extract-utilities" Nov 26 14:52:50 crc kubenswrapper[4651]: E1126 14:52:50.751468 4651 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d8e3c8-6e05-46be-9e58-349533485f18" containerName="extract-utilities" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.751475 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d8e3c8-6e05-46be-9e58-349533485f18" containerName="extract-utilities" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.751588 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="253c5900-fc2c-440b-a5bc-9731ad0eb9c5" containerName="registry-server" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.751600 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="24d8e3c8-6e05-46be-9e58-349533485f18" containerName="registry-server" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.752259 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.772210 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhtwk"] Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.942177 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab73482d-4b1c-481f-9728-36d8505e8a9b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhtwk\" (UID: \"ab73482d-4b1c-481f-9728-36d8505e8a9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.942547 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpbsp\" (UniqueName: \"kubernetes.io/projected/ab73482d-4b1c-481f-9728-36d8505e8a9b-kube-api-access-rpbsp\") pod \"marketplace-operator-79b997595-zhtwk\" (UID: \"ab73482d-4b1c-481f-9728-36d8505e8a9b\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:50 crc kubenswrapper[4651]: I1126 14:52:50.942581 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ab73482d-4b1c-481f-9728-36d8505e8a9b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhtwk\" (UID: \"ab73482d-4b1c-481f-9728-36d8505e8a9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.043425 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab73482d-4b1c-481f-9728-36d8505e8a9b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhtwk\" (UID: \"ab73482d-4b1c-481f-9728-36d8505e8a9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.043477 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpbsp\" (UniqueName: \"kubernetes.io/projected/ab73482d-4b1c-481f-9728-36d8505e8a9b-kube-api-access-rpbsp\") pod \"marketplace-operator-79b997595-zhtwk\" (UID: \"ab73482d-4b1c-481f-9728-36d8505e8a9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.043498 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ab73482d-4b1c-481f-9728-36d8505e8a9b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhtwk\" (UID: \"ab73482d-4b1c-481f-9728-36d8505e8a9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.045284 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/ab73482d-4b1c-481f-9728-36d8505e8a9b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhtwk\" (UID: \"ab73482d-4b1c-481f-9728-36d8505e8a9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.059174 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ab73482d-4b1c-481f-9728-36d8505e8a9b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhtwk\" (UID: \"ab73482d-4b1c-481f-9728-36d8505e8a9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.061283 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpbsp\" (UniqueName: \"kubernetes.io/projected/ab73482d-4b1c-481f-9728-36d8505e8a9b-kube-api-access-rpbsp\") pod \"marketplace-operator-79b997595-zhtwk\" (UID: \"ab73482d-4b1c-481f-9728-36d8505e8a9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.107718 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.144464 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-catalog-content\") pod \"962a3109-87ee-4cdb-9def-3676eb13e46a\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.144544 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-utilities\") pod \"962a3109-87ee-4cdb-9def-3676eb13e46a\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.144650 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dk2f\" (UniqueName: \"kubernetes.io/projected/962a3109-87ee-4cdb-9def-3676eb13e46a-kube-api-access-2dk2f\") pod \"962a3109-87ee-4cdb-9def-3676eb13e46a\" (UID: \"962a3109-87ee-4cdb-9def-3676eb13e46a\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.146303 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-utilities" (OuterVolumeSpecName: "utilities") pod "962a3109-87ee-4cdb-9def-3676eb13e46a" (UID: "962a3109-87ee-4cdb-9def-3676eb13e46a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.149813 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962a3109-87ee-4cdb-9def-3676eb13e46a-kube-api-access-2dk2f" (OuterVolumeSpecName: "kube-api-access-2dk2f") pod "962a3109-87ee-4cdb-9def-3676eb13e46a" (UID: "962a3109-87ee-4cdb-9def-3676eb13e46a"). InnerVolumeSpecName "kube-api-access-2dk2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.204811 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "962a3109-87ee-4cdb-9def-3676eb13e46a" (UID: "962a3109-87ee-4cdb-9def-3676eb13e46a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.207554 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.245364 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-utilities\") pod \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.245425 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94lrw\" (UniqueName: \"kubernetes.io/projected/e13d4652-4c12-40d7-bb77-edb7ce43bd47-kube-api-access-94lrw\") pod \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.245486 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-catalog-content\") pod \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\" (UID: \"e13d4652-4c12-40d7-bb77-edb7ce43bd47\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.245728 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dk2f\" (UniqueName: 
\"kubernetes.io/projected/962a3109-87ee-4cdb-9def-3676eb13e46a-kube-api-access-2dk2f\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.245747 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.245756 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962a3109-87ee-4cdb-9def-3676eb13e46a-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.247542 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-utilities" (OuterVolumeSpecName: "utilities") pod "e13d4652-4c12-40d7-bb77-edb7ce43bd47" (UID: "e13d4652-4c12-40d7-bb77-edb7ce43bd47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.249967 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e13d4652-4c12-40d7-bb77-edb7ce43bd47-kube-api-access-94lrw" (OuterVolumeSpecName: "kube-api-access-94lrw") pod "e13d4652-4c12-40d7-bb77-edb7ce43bd47" (UID: "e13d4652-4c12-40d7-bb77-edb7ce43bd47"). InnerVolumeSpecName "kube-api-access-94lrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.254227 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.260195 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.290443 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.324909 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.332520 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e13d4652-4c12-40d7-bb77-edb7ce43bd47" (UID: "e13d4652-4c12-40d7-bb77-edb7ce43bd47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.346689 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.346725 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94lrw\" (UniqueName: \"kubernetes.io/projected/e13d4652-4c12-40d7-bb77-edb7ce43bd47-kube-api-access-94lrw\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.346736 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e13d4652-4c12-40d7-bb77-edb7ce43bd47-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.447658 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84h87\" (UniqueName: 
\"kubernetes.io/projected/495ceaad-8c5b-477a-9630-21fdad21a5da-kube-api-access-84h87\") pod \"495ceaad-8c5b-477a-9630-21fdad21a5da\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.447706 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-operator-metrics\") pod \"b1e02d51-3be7-4c15-9e50-f446bca05403\" (UID: \"b1e02d51-3be7-4c15-9e50-f446bca05403\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.447745 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k22tk\" (UniqueName: \"kubernetes.io/projected/b1e02d51-3be7-4c15-9e50-f446bca05403-kube-api-access-k22tk\") pod \"b1e02d51-3be7-4c15-9e50-f446bca05403\" (UID: \"b1e02d51-3be7-4c15-9e50-f446bca05403\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.447795 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-catalog-content\") pod \"495ceaad-8c5b-477a-9630-21fdad21a5da\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.447843 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-utilities\") pod \"69edd84b-b59a-4094-b20b-a05bbe031a10\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.447892 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4n24\" (UniqueName: \"kubernetes.io/projected/69edd84b-b59a-4094-b20b-a05bbe031a10-kube-api-access-k4n24\") pod \"69edd84b-b59a-4094-b20b-a05bbe031a10\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " 
Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.447924 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-trusted-ca\") pod \"b1e02d51-3be7-4c15-9e50-f446bca05403\" (UID: \"b1e02d51-3be7-4c15-9e50-f446bca05403\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.447979 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-catalog-content\") pod \"69edd84b-b59a-4094-b20b-a05bbe031a10\" (UID: \"69edd84b-b59a-4094-b20b-a05bbe031a10\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.448000 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-utilities\") pod \"495ceaad-8c5b-477a-9630-21fdad21a5da\" (UID: \"495ceaad-8c5b-477a-9630-21fdad21a5da\") " Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.449296 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b1e02d51-3be7-4c15-9e50-f446bca05403" (UID: "b1e02d51-3be7-4c15-9e50-f446bca05403"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.449346 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-utilities" (OuterVolumeSpecName: "utilities") pod "495ceaad-8c5b-477a-9630-21fdad21a5da" (UID: "495ceaad-8c5b-477a-9630-21fdad21a5da"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.453218 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69edd84b-b59a-4094-b20b-a05bbe031a10-kube-api-access-k4n24" (OuterVolumeSpecName: "kube-api-access-k4n24") pod "69edd84b-b59a-4094-b20b-a05bbe031a10" (UID: "69edd84b-b59a-4094-b20b-a05bbe031a10"). InnerVolumeSpecName "kube-api-access-k4n24". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.457785 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1e02d51-3be7-4c15-9e50-f446bca05403-kube-api-access-k22tk" (OuterVolumeSpecName: "kube-api-access-k22tk") pod "b1e02d51-3be7-4c15-9e50-f446bca05403" (UID: "b1e02d51-3be7-4c15-9e50-f446bca05403"). InnerVolumeSpecName "kube-api-access-k22tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.460429 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/495ceaad-8c5b-477a-9630-21fdad21a5da-kube-api-access-84h87" (OuterVolumeSpecName: "kube-api-access-84h87") pod "495ceaad-8c5b-477a-9630-21fdad21a5da" (UID: "495ceaad-8c5b-477a-9630-21fdad21a5da"). InnerVolumeSpecName "kube-api-access-84h87". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.468957 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-utilities" (OuterVolumeSpecName: "utilities") pod "69edd84b-b59a-4094-b20b-a05bbe031a10" (UID: "69edd84b-b59a-4094-b20b-a05bbe031a10"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.476407 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "495ceaad-8c5b-477a-9630-21fdad21a5da" (UID: "495ceaad-8c5b-477a-9630-21fdad21a5da"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.481401 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b1e02d51-3be7-4c15-9e50-f446bca05403" (UID: "b1e02d51-3be7-4c15-9e50-f446bca05403"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.532483 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhtwk"] Nov 26 14:52:51 crc kubenswrapper[4651]: W1126 14:52:51.541618 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab73482d_4b1c_481f_9728_36d8505e8a9b.slice/crio-1c87d28eabbebdb7aeb208c6f9c46245f2517dae8ae7a1b362b5fc085c0c7cee WatchSource:0}: Error finding container 1c87d28eabbebdb7aeb208c6f9c46245f2517dae8ae7a1b362b5fc085c0c7cee: Status 404 returned error can't find the container with id 1c87d28eabbebdb7aeb208c6f9c46245f2517dae8ae7a1b362b5fc085c0c7cee Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.548958 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84h87\" (UniqueName: \"kubernetes.io/projected/495ceaad-8c5b-477a-9630-21fdad21a5da-kube-api-access-84h87\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 
crc kubenswrapper[4651]: I1126 14:52:51.548976 4651 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.548984 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k22tk\" (UniqueName: \"kubernetes.io/projected/b1e02d51-3be7-4c15-9e50-f446bca05403-kube-api-access-k22tk\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.549011 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.549019 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.549028 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4n24\" (UniqueName: \"kubernetes.io/projected/69edd84b-b59a-4094-b20b-a05bbe031a10-kube-api-access-k4n24\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.549063 4651 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1e02d51-3be7-4c15-9e50-f446bca05403-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.549073 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/495ceaad-8c5b-477a-9630-21fdad21a5da-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.551936 4651 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69edd84b-b59a-4094-b20b-a05bbe031a10" (UID: "69edd84b-b59a-4094-b20b-a05bbe031a10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.649688 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69edd84b-b59a-4094-b20b-a05bbe031a10-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.694164 4651 generic.go:334] "Generic (PLEG): container finished" podID="962a3109-87ee-4cdb-9def-3676eb13e46a" containerID="6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676" exitCode=0 Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.694237 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bkcjt" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.694247 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkcjt" event={"ID":"962a3109-87ee-4cdb-9def-3676eb13e46a","Type":"ContainerDied","Data":"6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676"} Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.694294 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkcjt" event={"ID":"962a3109-87ee-4cdb-9def-3676eb13e46a","Type":"ContainerDied","Data":"8bba5666ef6e9d436b1cdd562e1feb144b56adc586aa1e12d3968d892e2727b0"} Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.694317 4651 scope.go:117] "RemoveContainer" containerID="6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.706227 4651 generic.go:334] "Generic (PLEG): container finished" podID="b1e02d51-3be7-4c15-9e50-f446bca05403" containerID="d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e" exitCode=0 Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.706289 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" event={"ID":"b1e02d51-3be7-4c15-9e50-f446bca05403","Type":"ContainerDied","Data":"d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e"} Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.706314 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" event={"ID":"b1e02d51-3be7-4c15-9e50-f446bca05403","Type":"ContainerDied","Data":"f3a2cd004e7307fa51f963c8bee460c22083fe214878ff9079e0d8328555b337"} Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.706407 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jgc22" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.711020 4651 generic.go:334] "Generic (PLEG): container finished" podID="69edd84b-b59a-4094-b20b-a05bbe031a10" containerID="2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3" exitCode=0 Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.711072 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qglps" event={"ID":"69edd84b-b59a-4094-b20b-a05bbe031a10","Type":"ContainerDied","Data":"2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3"} Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.711122 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qglps" event={"ID":"69edd84b-b59a-4094-b20b-a05bbe031a10","Type":"ContainerDied","Data":"973d1969a650eaca1b0b3d156626b8bad02c3daf6a7fc15c8bae642c1ba8290b"} Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.711426 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qglps" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.712794 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" event={"ID":"ab73482d-4b1c-481f-9728-36d8505e8a9b","Type":"ContainerStarted","Data":"1c87d28eabbebdb7aeb208c6f9c46245f2517dae8ae7a1b362b5fc085c0c7cee"} Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.720614 4651 generic.go:334] "Generic (PLEG): container finished" podID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" containerID="10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719" exitCode=0 Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.720783 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqpd2" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.721429 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqpd2" event={"ID":"e13d4652-4c12-40d7-bb77-edb7ce43bd47","Type":"ContainerDied","Data":"10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719"} Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.721456 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqpd2" event={"ID":"e13d4652-4c12-40d7-bb77-edb7ce43bd47","Type":"ContainerDied","Data":"37f40655cd0d1e9a78fe70e617816500d55286cd76c306981778c9eb3f0bb5f9"} Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.729589 4651 generic.go:334] "Generic (PLEG): container finished" podID="495ceaad-8c5b-477a-9630-21fdad21a5da" containerID="54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e" exitCode=0 Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.729624 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkzc8" event={"ID":"495ceaad-8c5b-477a-9630-21fdad21a5da","Type":"ContainerDied","Data":"54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e"} Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.729640 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkzc8" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.729646 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkzc8" event={"ID":"495ceaad-8c5b-477a-9630-21fdad21a5da","Type":"ContainerDied","Data":"3380af0360ccaf733c95b01f773be028d45106ae53eed47b953c2efe55a272ec"} Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.730207 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bkcjt"] Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.734925 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bkcjt"] Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.748079 4651 scope.go:117] "RemoveContainer" containerID="18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.794767 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jgc22"] Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.795266 4651 scope.go:117] "RemoveContainer" containerID="1a40ed276383820d6f45ac153a1bfa5f87f96460780fb7f372c2aa183155010a" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.815172 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jgc22"] Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.831850 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkzc8"] Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.840378 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkzc8"] Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.841953 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqpd2"] Nov 26 
14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.847233 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vqpd2"] Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.856364 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qglps"] Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.857271 4651 scope.go:117] "RemoveContainer" containerID="6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676" Nov 26 14:52:51 crc kubenswrapper[4651]: E1126 14:52:51.857707 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676\": container with ID starting with 6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676 not found: ID does not exist" containerID="6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.857744 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676"} err="failed to get container status \"6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676\": rpc error: code = NotFound desc = could not find container \"6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676\": container with ID starting with 6394e812c6062bdc0014860e83eb6f76221e57478e2e03ad376a013297073676 not found: ID does not exist" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.857767 4651 scope.go:117] "RemoveContainer" containerID="18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28" Nov 26 14:52:51 crc kubenswrapper[4651]: E1126 14:52:51.858339 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28\": container with ID starting with 18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28 not found: ID does not exist" containerID="18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.858367 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28"} err="failed to get container status \"18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28\": rpc error: code = NotFound desc = could not find container \"18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28\": container with ID starting with 18e74a29b8e585830e31ce73f9d25a456f6eb7ef1967af533601c3676129af28 not found: ID does not exist" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.858381 4651 scope.go:117] "RemoveContainer" containerID="1a40ed276383820d6f45ac153a1bfa5f87f96460780fb7f372c2aa183155010a" Nov 26 14:52:51 crc kubenswrapper[4651]: E1126 14:52:51.858587 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a40ed276383820d6f45ac153a1bfa5f87f96460780fb7f372c2aa183155010a\": container with ID starting with 1a40ed276383820d6f45ac153a1bfa5f87f96460780fb7f372c2aa183155010a not found: ID does not exist" containerID="1a40ed276383820d6f45ac153a1bfa5f87f96460780fb7f372c2aa183155010a" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.858613 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a40ed276383820d6f45ac153a1bfa5f87f96460780fb7f372c2aa183155010a"} err="failed to get container status \"1a40ed276383820d6f45ac153a1bfa5f87f96460780fb7f372c2aa183155010a\": rpc error: code = NotFound desc = could not find container \"1a40ed276383820d6f45ac153a1bfa5f87f96460780fb7f372c2aa183155010a\": container with ID 
starting with 1a40ed276383820d6f45ac153a1bfa5f87f96460780fb7f372c2aa183155010a not found: ID does not exist" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.858633 4651 scope.go:117] "RemoveContainer" containerID="d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.863306 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qglps"] Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.875652 4651 scope.go:117] "RemoveContainer" containerID="d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e" Nov 26 14:52:51 crc kubenswrapper[4651]: E1126 14:52:51.876494 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e\": container with ID starting with d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e not found: ID does not exist" containerID="d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.876532 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e"} err="failed to get container status \"d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e\": rpc error: code = NotFound desc = could not find container \"d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e\": container with ID starting with d88bb6a6071e2a6079bb7bda0873087173689000d360e881f76336653488c71e not found: ID does not exist" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.876558 4651 scope.go:117] "RemoveContainer" containerID="2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.889705 4651 scope.go:117] "RemoveContainer" 
containerID="6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.904743 4651 scope.go:117] "RemoveContainer" containerID="a873d5e7f8b3ffbb063f2d18727fb07d5cba8c5e50bd65d2d0c24d0ae25453c3" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.927430 4651 scope.go:117] "RemoveContainer" containerID="2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3" Nov 26 14:52:51 crc kubenswrapper[4651]: E1126 14:52:51.927942 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3\": container with ID starting with 2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3 not found: ID does not exist" containerID="2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.928053 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3"} err="failed to get container status \"2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3\": rpc error: code = NotFound desc = could not find container \"2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3\": container with ID starting with 2306eda0cb54d9c4d826f5c1d5b37363f4040c698aaa2d9b7f6753f1771659e3 not found: ID does not exist" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.928148 4651 scope.go:117] "RemoveContainer" containerID="6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf" Nov 26 14:52:51 crc kubenswrapper[4651]: E1126 14:52:51.930176 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf\": container with ID starting with 
6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf not found: ID does not exist" containerID="6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.930224 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf"} err="failed to get container status \"6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf\": rpc error: code = NotFound desc = could not find container \"6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf\": container with ID starting with 6ad8c8aa1920d74aa06d4e26ece4fd3a9872faacc545f955678732bc6843c7cf not found: ID does not exist" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.930257 4651 scope.go:117] "RemoveContainer" containerID="a873d5e7f8b3ffbb063f2d18727fb07d5cba8c5e50bd65d2d0c24d0ae25453c3" Nov 26 14:52:51 crc kubenswrapper[4651]: E1126 14:52:51.930718 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a873d5e7f8b3ffbb063f2d18727fb07d5cba8c5e50bd65d2d0c24d0ae25453c3\": container with ID starting with a873d5e7f8b3ffbb063f2d18727fb07d5cba8c5e50bd65d2d0c24d0ae25453c3 not found: ID does not exist" containerID="a873d5e7f8b3ffbb063f2d18727fb07d5cba8c5e50bd65d2d0c24d0ae25453c3" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.930779 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a873d5e7f8b3ffbb063f2d18727fb07d5cba8c5e50bd65d2d0c24d0ae25453c3"} err="failed to get container status \"a873d5e7f8b3ffbb063f2d18727fb07d5cba8c5e50bd65d2d0c24d0ae25453c3\": rpc error: code = NotFound desc = could not find container \"a873d5e7f8b3ffbb063f2d18727fb07d5cba8c5e50bd65d2d0c24d0ae25453c3\": container with ID starting with a873d5e7f8b3ffbb063f2d18727fb07d5cba8c5e50bd65d2d0c24d0ae25453c3 not found: ID does not 
exist" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.930805 4651 scope.go:117] "RemoveContainer" containerID="10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.952093 4651 scope.go:117] "RemoveContainer" containerID="bfa198cffb8c9d785cacf68319f18ab09ec09b17871324f3f4ff1c1fe0667966" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.974166 4651 scope.go:117] "RemoveContainer" containerID="9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.991127 4651 scope.go:117] "RemoveContainer" containerID="10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719" Nov 26 14:52:51 crc kubenswrapper[4651]: E1126 14:52:51.991752 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719\": container with ID starting with 10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719 not found: ID does not exist" containerID="10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.991829 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719"} err="failed to get container status \"10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719\": rpc error: code = NotFound desc = could not find container \"10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719\": container with ID starting with 10e1865a683ffc12a36d74cf368639318ae70d5a10901f0fee14170db9896719 not found: ID does not exist" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.991872 4651 scope.go:117] "RemoveContainer" containerID="bfa198cffb8c9d785cacf68319f18ab09ec09b17871324f3f4ff1c1fe0667966" Nov 26 14:52:51 crc 
kubenswrapper[4651]: E1126 14:52:51.992278 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa198cffb8c9d785cacf68319f18ab09ec09b17871324f3f4ff1c1fe0667966\": container with ID starting with bfa198cffb8c9d785cacf68319f18ab09ec09b17871324f3f4ff1c1fe0667966 not found: ID does not exist" containerID="bfa198cffb8c9d785cacf68319f18ab09ec09b17871324f3f4ff1c1fe0667966" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.992380 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa198cffb8c9d785cacf68319f18ab09ec09b17871324f3f4ff1c1fe0667966"} err="failed to get container status \"bfa198cffb8c9d785cacf68319f18ab09ec09b17871324f3f4ff1c1fe0667966\": rpc error: code = NotFound desc = could not find container \"bfa198cffb8c9d785cacf68319f18ab09ec09b17871324f3f4ff1c1fe0667966\": container with ID starting with bfa198cffb8c9d785cacf68319f18ab09ec09b17871324f3f4ff1c1fe0667966 not found: ID does not exist" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.992538 4651 scope.go:117] "RemoveContainer" containerID="9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be" Nov 26 14:52:51 crc kubenswrapper[4651]: E1126 14:52:51.992870 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be\": container with ID starting with 9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be not found: ID does not exist" containerID="9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.992905 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be"} err="failed to get container status 
\"9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be\": rpc error: code = NotFound desc = could not find container \"9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be\": container with ID starting with 9a735cf67e370c83dd52f86573da777f1b9009aaef33593f618f95f692deb9be not found: ID does not exist" Nov 26 14:52:51 crc kubenswrapper[4651]: I1126 14:52:51.992929 4651 scope.go:117] "RemoveContainer" containerID="54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.012913 4651 scope.go:117] "RemoveContainer" containerID="db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.024613 4651 scope.go:117] "RemoveContainer" containerID="91b9227cf6ee8fa6410e4b392fd8b24288047c8137e87c94975acbca476e9439" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.039932 4651 scope.go:117] "RemoveContainer" containerID="54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.040545 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e\": container with ID starting with 54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e not found: ID does not exist" containerID="54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.040581 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e"} err="failed to get container status \"54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e\": rpc error: code = NotFound desc = could not find container \"54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e\": container with ID starting 
with 54621bc9f200137661bc902ee080cd9f01700090b4df0db04af9f2347d74a44e not found: ID does not exist" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.040614 4651 scope.go:117] "RemoveContainer" containerID="db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.040949 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1\": container with ID starting with db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1 not found: ID does not exist" containerID="db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.040992 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1"} err="failed to get container status \"db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1\": rpc error: code = NotFound desc = could not find container \"db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1\": container with ID starting with db292c6467465eafc379b4c9d3902ff0c3bceaf0a616c89524799932c45cc6f1 not found: ID does not exist" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.041016 4651 scope.go:117] "RemoveContainer" containerID="91b9227cf6ee8fa6410e4b392fd8b24288047c8137e87c94975acbca476e9439" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.041660 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b9227cf6ee8fa6410e4b392fd8b24288047c8137e87c94975acbca476e9439\": container with ID starting with 91b9227cf6ee8fa6410e4b392fd8b24288047c8137e87c94975acbca476e9439 not found: ID does not exist" containerID="91b9227cf6ee8fa6410e4b392fd8b24288047c8137e87c94975acbca476e9439" Nov 26 14:52:52 
crc kubenswrapper[4651]: I1126 14:52:52.041690 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b9227cf6ee8fa6410e4b392fd8b24288047c8137e87c94975acbca476e9439"} err="failed to get container status \"91b9227cf6ee8fa6410e4b392fd8b24288047c8137e87c94975acbca476e9439\": rpc error: code = NotFound desc = could not find container \"91b9227cf6ee8fa6410e4b392fd8b24288047c8137e87c94975acbca476e9439\": container with ID starting with 91b9227cf6ee8fa6410e4b392fd8b24288047c8137e87c94975acbca476e9439 not found: ID does not exist" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292501 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z5pzs"] Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.292786 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" containerName="extract-utilities" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292797 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" containerName="extract-utilities" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.292818 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495ceaad-8c5b-477a-9630-21fdad21a5da" containerName="extract-utilities" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292825 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="495ceaad-8c5b-477a-9630-21fdad21a5da" containerName="extract-utilities" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.292837 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962a3109-87ee-4cdb-9def-3676eb13e46a" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292843 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="962a3109-87ee-4cdb-9def-3676eb13e46a" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 
14:52:52.292854 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495ceaad-8c5b-477a-9630-21fdad21a5da" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292863 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="495ceaad-8c5b-477a-9630-21fdad21a5da" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.292871 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e02d51-3be7-4c15-9e50-f446bca05403" containerName="marketplace-operator" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292878 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e02d51-3be7-4c15-9e50-f446bca05403" containerName="marketplace-operator" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.292892 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495ceaad-8c5b-477a-9630-21fdad21a5da" containerName="extract-content" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292899 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="495ceaad-8c5b-477a-9630-21fdad21a5da" containerName="extract-content" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.292910 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" containerName="extract-content" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292916 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" containerName="extract-content" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.292925 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69edd84b-b59a-4094-b20b-a05bbe031a10" containerName="extract-utilities" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292931 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="69edd84b-b59a-4094-b20b-a05bbe031a10" containerName="extract-utilities" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 
14:52:52.292943 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69edd84b-b59a-4094-b20b-a05bbe031a10" containerName="extract-content" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292949 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="69edd84b-b59a-4094-b20b-a05bbe031a10" containerName="extract-content" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.292962 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962a3109-87ee-4cdb-9def-3676eb13e46a" containerName="extract-content" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292968 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="962a3109-87ee-4cdb-9def-3676eb13e46a" containerName="extract-content" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.292981 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962a3109-87ee-4cdb-9def-3676eb13e46a" containerName="extract-utilities" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.292990 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="962a3109-87ee-4cdb-9def-3676eb13e46a" containerName="extract-utilities" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.292997 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69edd84b-b59a-4094-b20b-a05bbe031a10" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.293003 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="69edd84b-b59a-4094-b20b-a05bbe031a10" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: E1126 14:52:52.293009 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.293015 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 
14:52:52.293185 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="495ceaad-8c5b-477a-9630-21fdad21a5da" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.293207 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="69edd84b-b59a-4094-b20b-a05bbe031a10" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.293217 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1e02d51-3be7-4c15-9e50-f446bca05403" containerName="marketplace-operator" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.293227 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="962a3109-87ee-4cdb-9def-3676eb13e46a" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.293235 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" containerName="registry-server" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.297562 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.306930 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z5pzs"] Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.307274 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.359058 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b82390c-5e5e-4d38-91ea-7b1a1c3820d7-utilities\") pod \"certified-operators-z5pzs\" (UID: \"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7\") " pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.359142 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpwjj\" (UniqueName: \"kubernetes.io/projected/3b82390c-5e5e-4d38-91ea-7b1a1c3820d7-kube-api-access-wpwjj\") pod \"certified-operators-z5pzs\" (UID: \"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7\") " pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.359210 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b82390c-5e5e-4d38-91ea-7b1a1c3820d7-catalog-content\") pod \"certified-operators-z5pzs\" (UID: \"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7\") " pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.460174 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b82390c-5e5e-4d38-91ea-7b1a1c3820d7-catalog-content\") pod \"certified-operators-z5pzs\" (UID: 
\"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7\") " pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.460248 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b82390c-5e5e-4d38-91ea-7b1a1c3820d7-utilities\") pod \"certified-operators-z5pzs\" (UID: \"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7\") " pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.460297 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpwjj\" (UniqueName: \"kubernetes.io/projected/3b82390c-5e5e-4d38-91ea-7b1a1c3820d7-kube-api-access-wpwjj\") pod \"certified-operators-z5pzs\" (UID: \"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7\") " pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.460694 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b82390c-5e5e-4d38-91ea-7b1a1c3820d7-catalog-content\") pod \"certified-operators-z5pzs\" (UID: \"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7\") " pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.461436 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b82390c-5e5e-4d38-91ea-7b1a1c3820d7-utilities\") pod \"certified-operators-z5pzs\" (UID: \"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7\") " pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.479352 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpwjj\" (UniqueName: \"kubernetes.io/projected/3b82390c-5e5e-4d38-91ea-7b1a1c3820d7-kube-api-access-wpwjj\") pod \"certified-operators-z5pzs\" (UID: 
\"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7\") " pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.614212 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.740404 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" event={"ID":"ab73482d-4b1c-481f-9728-36d8505e8a9b","Type":"ContainerStarted","Data":"13b296ec040d57fcd1584fa18821fa9e858e253171c562990a80584134253163"} Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.742247 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.747691 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" Nov 26 14:52:52 crc kubenswrapper[4651]: I1126 14:52:52.762620 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zhtwk" podStartSLOduration=2.762598792 podStartE2EDuration="2.762598792s" podCreationTimestamp="2025-11-26 14:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:52:52.758445252 +0000 UTC m=+140.184192876" watchObservedRunningTime="2025-11-26 14:52:52.762598792 +0000 UTC m=+140.188346396" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.047407 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z5pzs"] Nov 26 14:52:53 crc kubenswrapper[4651]: W1126 14:52:53.056260 4651 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b82390c_5e5e_4d38_91ea_7b1a1c3820d7.slice/crio-464d6929cd9243b2facbdee53a98bd6d8b402396df3065cb11acc73fa7cb803b WatchSource:0}: Error finding container 464d6929cd9243b2facbdee53a98bd6d8b402396df3065cb11acc73fa7cb803b: Status 404 returned error can't find the container with id 464d6929cd9243b2facbdee53a98bd6d8b402396df3065cb11acc73fa7cb803b Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.409455 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="495ceaad-8c5b-477a-9630-21fdad21a5da" path="/var/lib/kubelet/pods/495ceaad-8c5b-477a-9630-21fdad21a5da/volumes" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.410203 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69edd84b-b59a-4094-b20b-a05bbe031a10" path="/var/lib/kubelet/pods/69edd84b-b59a-4094-b20b-a05bbe031a10/volumes" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.410794 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962a3109-87ee-4cdb-9def-3676eb13e46a" path="/var/lib/kubelet/pods/962a3109-87ee-4cdb-9def-3676eb13e46a/volumes" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.414152 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1e02d51-3be7-4c15-9e50-f446bca05403" path="/var/lib/kubelet/pods/b1e02d51-3be7-4c15-9e50-f446bca05403/volumes" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.415602 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e13d4652-4c12-40d7-bb77-edb7ce43bd47" path="/var/lib/kubelet/pods/e13d4652-4c12-40d7-bb77-edb7ce43bd47/volumes" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.681139 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vtwvs"] Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.682419 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.685000 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.685226 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31d40458-c747-413d-8a81-43906c765b3d-catalog-content\") pod \"redhat-marketplace-vtwvs\" (UID: \"31d40458-c747-413d-8a81-43906c765b3d\") " pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.685289 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dklk\" (UniqueName: \"kubernetes.io/projected/31d40458-c747-413d-8a81-43906c765b3d-kube-api-access-5dklk\") pod \"redhat-marketplace-vtwvs\" (UID: \"31d40458-c747-413d-8a81-43906c765b3d\") " pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.685313 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31d40458-c747-413d-8a81-43906c765b3d-utilities\") pod \"redhat-marketplace-vtwvs\" (UID: \"31d40458-c747-413d-8a81-43906c765b3d\") " pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.696888 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtwvs"] Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.767708 4651 generic.go:334] "Generic (PLEG): container finished" podID="3b82390c-5e5e-4d38-91ea-7b1a1c3820d7" containerID="84fb3c3e839609a59392ad58a6c5ba66ff0093f283913859d5cadea744f8b6b9" exitCode=0 Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 
14:52:53.767781 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5pzs" event={"ID":"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7","Type":"ContainerDied","Data":"84fb3c3e839609a59392ad58a6c5ba66ff0093f283913859d5cadea744f8b6b9"} Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.767814 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5pzs" event={"ID":"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7","Type":"ContainerStarted","Data":"464d6929cd9243b2facbdee53a98bd6d8b402396df3065cb11acc73fa7cb803b"} Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.786802 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dklk\" (UniqueName: \"kubernetes.io/projected/31d40458-c747-413d-8a81-43906c765b3d-kube-api-access-5dklk\") pod \"redhat-marketplace-vtwvs\" (UID: \"31d40458-c747-413d-8a81-43906c765b3d\") " pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.786984 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31d40458-c747-413d-8a81-43906c765b3d-utilities\") pod \"redhat-marketplace-vtwvs\" (UID: \"31d40458-c747-413d-8a81-43906c765b3d\") " pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.787051 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31d40458-c747-413d-8a81-43906c765b3d-catalog-content\") pod \"redhat-marketplace-vtwvs\" (UID: \"31d40458-c747-413d-8a81-43906c765b3d\") " pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.787548 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/31d40458-c747-413d-8a81-43906c765b3d-catalog-content\") pod \"redhat-marketplace-vtwvs\" (UID: \"31d40458-c747-413d-8a81-43906c765b3d\") " pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.787857 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31d40458-c747-413d-8a81-43906c765b3d-utilities\") pod \"redhat-marketplace-vtwvs\" (UID: \"31d40458-c747-413d-8a81-43906c765b3d\") " pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:52:53 crc kubenswrapper[4651]: I1126 14:52:53.804894 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dklk\" (UniqueName: \"kubernetes.io/projected/31d40458-c747-413d-8a81-43906c765b3d-kube-api-access-5dklk\") pod \"redhat-marketplace-vtwvs\" (UID: \"31d40458-c747-413d-8a81-43906c765b3d\") " pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.007590 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.382802 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtwvs"] Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.707620 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qzt4w"] Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.714006 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzt4w"] Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.714136 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qzt4w" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.716298 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.774979 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5pzs" event={"ID":"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7","Type":"ContainerStarted","Data":"e665224474ab1bb00303e4d86c0724dc6a321583dac7bf1e88dba49e873ab9b8"} Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.776759 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtwvs" event={"ID":"31d40458-c747-413d-8a81-43906c765b3d","Type":"ContainerDied","Data":"7335e0c8e32484b3ee812e0e8d3649c4c72772165ba710fce2ae33a084c0fa3a"} Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.777330 4651 generic.go:334] "Generic (PLEG): container finished" podID="31d40458-c747-413d-8a81-43906c765b3d" containerID="7335e0c8e32484b3ee812e0e8d3649c4c72772165ba710fce2ae33a084c0fa3a" exitCode=0 Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.777447 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtwvs" event={"ID":"31d40458-c747-413d-8a81-43906c765b3d","Type":"ContainerStarted","Data":"b0d1c48d669ba35b9a244d71d3c70ebfaae2d767ba34ba3e11e00572f936a86e"} Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.804118 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5db8f2dc-6548-4271-9e83-27e1fc8ab069-utilities\") pod \"redhat-operators-qzt4w\" (UID: \"5db8f2dc-6548-4271-9e83-27e1fc8ab069\") " pod="openshift-marketplace/redhat-operators-qzt4w" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.804389 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dtvm\" (UniqueName: \"kubernetes.io/projected/5db8f2dc-6548-4271-9e83-27e1fc8ab069-kube-api-access-5dtvm\") pod \"redhat-operators-qzt4w\" (UID: \"5db8f2dc-6548-4271-9e83-27e1fc8ab069\") " pod="openshift-marketplace/redhat-operators-qzt4w" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.804488 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5db8f2dc-6548-4271-9e83-27e1fc8ab069-catalog-content\") pod \"redhat-operators-qzt4w\" (UID: \"5db8f2dc-6548-4271-9e83-27e1fc8ab069\") " pod="openshift-marketplace/redhat-operators-qzt4w" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.905195 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5db8f2dc-6548-4271-9e83-27e1fc8ab069-catalog-content\") pod \"redhat-operators-qzt4w\" (UID: \"5db8f2dc-6548-4271-9e83-27e1fc8ab069\") " pod="openshift-marketplace/redhat-operators-qzt4w" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.905266 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5db8f2dc-6548-4271-9e83-27e1fc8ab069-utilities\") pod \"redhat-operators-qzt4w\" (UID: \"5db8f2dc-6548-4271-9e83-27e1fc8ab069\") " pod="openshift-marketplace/redhat-operators-qzt4w" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.905299 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dtvm\" (UniqueName: \"kubernetes.io/projected/5db8f2dc-6548-4271-9e83-27e1fc8ab069-kube-api-access-5dtvm\") pod \"redhat-operators-qzt4w\" (UID: \"5db8f2dc-6548-4271-9e83-27e1fc8ab069\") " pod="openshift-marketplace/redhat-operators-qzt4w" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.905944 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5db8f2dc-6548-4271-9e83-27e1fc8ab069-catalog-content\") pod \"redhat-operators-qzt4w\" (UID: \"5db8f2dc-6548-4271-9e83-27e1fc8ab069\") " pod="openshift-marketplace/redhat-operators-qzt4w" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.906145 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5db8f2dc-6548-4271-9e83-27e1fc8ab069-utilities\") pod \"redhat-operators-qzt4w\" (UID: \"5db8f2dc-6548-4271-9e83-27e1fc8ab069\") " pod="openshift-marketplace/redhat-operators-qzt4w" Nov 26 14:52:54 crc kubenswrapper[4651]: I1126 14:52:54.922230 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dtvm\" (UniqueName: \"kubernetes.io/projected/5db8f2dc-6548-4271-9e83-27e1fc8ab069-kube-api-access-5dtvm\") pod \"redhat-operators-qzt4w\" (UID: \"5db8f2dc-6548-4271-9e83-27e1fc8ab069\") " pod="openshift-marketplace/redhat-operators-qzt4w" Nov 26 14:52:55 crc kubenswrapper[4651]: I1126 14:52:55.050412 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qzt4w" Nov 26 14:52:55 crc kubenswrapper[4651]: I1126 14:52:55.477150 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzt4w"] Nov 26 14:52:55 crc kubenswrapper[4651]: I1126 14:52:55.785267 4651 generic.go:334] "Generic (PLEG): container finished" podID="5db8f2dc-6548-4271-9e83-27e1fc8ab069" containerID="cf007c3b4055ee4b5b934bce3b526ad4217a1ab616a65e34c6c02d6cd21489a7" exitCode=0 Nov 26 14:52:55 crc kubenswrapper[4651]: I1126 14:52:55.785455 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzt4w" event={"ID":"5db8f2dc-6548-4271-9e83-27e1fc8ab069","Type":"ContainerDied","Data":"cf007c3b4055ee4b5b934bce3b526ad4217a1ab616a65e34c6c02d6cd21489a7"} Nov 26 14:52:55 crc kubenswrapper[4651]: I1126 14:52:55.785666 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzt4w" event={"ID":"5db8f2dc-6548-4271-9e83-27e1fc8ab069","Type":"ContainerStarted","Data":"16b23bf17bdad8041fa8a353842c3bfd9688ef690921cac7aee8b9d8bc791a34"} Nov 26 14:52:55 crc kubenswrapper[4651]: I1126 14:52:55.789310 4651 generic.go:334] "Generic (PLEG): container finished" podID="3b82390c-5e5e-4d38-91ea-7b1a1c3820d7" containerID="e665224474ab1bb00303e4d86c0724dc6a321583dac7bf1e88dba49e873ab9b8" exitCode=0 Nov 26 14:52:55 crc kubenswrapper[4651]: I1126 14:52:55.789387 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5pzs" event={"ID":"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7","Type":"ContainerDied","Data":"e665224474ab1bb00303e4d86c0724dc6a321583dac7bf1e88dba49e873ab9b8"} Nov 26 14:52:55 crc kubenswrapper[4651]: I1126 14:52:55.791896 4651 generic.go:334] "Generic (PLEG): container finished" podID="31d40458-c747-413d-8a81-43906c765b3d" containerID="51d61f88654c04ed353277943256e5cda0a374e2da1911852c98f22aa50ce153" exitCode=0 Nov 26 14:52:55 crc 
kubenswrapper[4651]: I1126 14:52:55.791923 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtwvs" event={"ID":"31d40458-c747-413d-8a81-43906c765b3d","Type":"ContainerDied","Data":"51d61f88654c04ed353277943256e5cda0a374e2da1911852c98f22aa50ce153"} Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.085345 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hsg6c"] Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.086463 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hsg6c" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.090257 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.099623 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hsg6c"] Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.254028 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/999d25b1-0acb-4cb4-bb7b-f65d770cf7e6-catalog-content\") pod \"community-operators-hsg6c\" (UID: \"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6\") " pod="openshift-marketplace/community-operators-hsg6c" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.254090 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/999d25b1-0acb-4cb4-bb7b-f65d770cf7e6-utilities\") pod \"community-operators-hsg6c\" (UID: \"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6\") " pod="openshift-marketplace/community-operators-hsg6c" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.254604 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-9tvfz\" (UniqueName: \"kubernetes.io/projected/999d25b1-0acb-4cb4-bb7b-f65d770cf7e6-kube-api-access-9tvfz\") pod \"community-operators-hsg6c\" (UID: \"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6\") " pod="openshift-marketplace/community-operators-hsg6c" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.355663 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tvfz\" (UniqueName: \"kubernetes.io/projected/999d25b1-0acb-4cb4-bb7b-f65d770cf7e6-kube-api-access-9tvfz\") pod \"community-operators-hsg6c\" (UID: \"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6\") " pod="openshift-marketplace/community-operators-hsg6c" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.355767 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/999d25b1-0acb-4cb4-bb7b-f65d770cf7e6-catalog-content\") pod \"community-operators-hsg6c\" (UID: \"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6\") " pod="openshift-marketplace/community-operators-hsg6c" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.355786 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/999d25b1-0acb-4cb4-bb7b-f65d770cf7e6-utilities\") pod \"community-operators-hsg6c\" (UID: \"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6\") " pod="openshift-marketplace/community-operators-hsg6c" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.356273 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/999d25b1-0acb-4cb4-bb7b-f65d770cf7e6-utilities\") pod \"community-operators-hsg6c\" (UID: \"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6\") " pod="openshift-marketplace/community-operators-hsg6c" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.356328 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/999d25b1-0acb-4cb4-bb7b-f65d770cf7e6-catalog-content\") pod \"community-operators-hsg6c\" (UID: \"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6\") " pod="openshift-marketplace/community-operators-hsg6c" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.373019 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tvfz\" (UniqueName: \"kubernetes.io/projected/999d25b1-0acb-4cb4-bb7b-f65d770cf7e6-kube-api-access-9tvfz\") pod \"community-operators-hsg6c\" (UID: \"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6\") " pod="openshift-marketplace/community-operators-hsg6c" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.410794 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hsg6c" Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.818328 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5pzs" event={"ID":"3b82390c-5e5e-4d38-91ea-7b1a1c3820d7","Type":"ContainerStarted","Data":"68ec0256ec1b2bf61fa500f8e31572f08401134d1b7d68232edfb85c3f9a177b"} Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.842398 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z5pzs" podStartSLOduration=2.232042896 podStartE2EDuration="4.842377412s" podCreationTimestamp="2025-11-26 14:52:52 +0000 UTC" firstStartedPulling="2025-11-26 14:52:53.770900014 +0000 UTC m=+141.196647618" lastFinishedPulling="2025-11-26 14:52:56.38123453 +0000 UTC m=+143.806982134" observedRunningTime="2025-11-26 14:52:56.840097035 +0000 UTC m=+144.265844659" watchObservedRunningTime="2025-11-26 14:52:56.842377412 +0000 UTC m=+144.268125016" Nov 26 14:52:56 crc kubenswrapper[4651]: W1126 14:52:56.851187 4651 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod999d25b1_0acb_4cb4_bb7b_f65d770cf7e6.slice/crio-a2c1e14e18b79cfa7bd6f7db558275886fa680472a83e92314c99e082b0adbd0 WatchSource:0}: Error finding container a2c1e14e18b79cfa7bd6f7db558275886fa680472a83e92314c99e082b0adbd0: Status 404 returned error can't find the container with id a2c1e14e18b79cfa7bd6f7db558275886fa680472a83e92314c99e082b0adbd0 Nov 26 14:52:56 crc kubenswrapper[4651]: I1126 14:52:56.856724 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hsg6c"] Nov 26 14:52:57 crc kubenswrapper[4651]: I1126 14:52:57.824636 4651 generic.go:334] "Generic (PLEG): container finished" podID="999d25b1-0acb-4cb4-bb7b-f65d770cf7e6" containerID="564df749b3e10c74345a810f420a1210ab902660969a767b113f52637477587c" exitCode=0 Nov 26 14:52:57 crc kubenswrapper[4651]: I1126 14:52:57.824696 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsg6c" event={"ID":"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6","Type":"ContainerDied","Data":"564df749b3e10c74345a810f420a1210ab902660969a767b113f52637477587c"} Nov 26 14:52:57 crc kubenswrapper[4651]: I1126 14:52:57.825060 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsg6c" event={"ID":"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6","Type":"ContainerStarted","Data":"a2c1e14e18b79cfa7bd6f7db558275886fa680472a83e92314c99e082b0adbd0"} Nov 26 14:52:57 crc kubenswrapper[4651]: I1126 14:52:57.828479 4651 generic.go:334] "Generic (PLEG): container finished" podID="5db8f2dc-6548-4271-9e83-27e1fc8ab069" containerID="43998eed4d3740ce6a15034bd29c48ecdedb3eb8693c02cdeb232bcc31adfbd6" exitCode=0 Nov 26 14:52:57 crc kubenswrapper[4651]: I1126 14:52:57.828544 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzt4w" 
event={"ID":"5db8f2dc-6548-4271-9e83-27e1fc8ab069","Type":"ContainerDied","Data":"43998eed4d3740ce6a15034bd29c48ecdedb3eb8693c02cdeb232bcc31adfbd6"} Nov 26 14:52:57 crc kubenswrapper[4651]: I1126 14:52:57.838002 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtwvs" event={"ID":"31d40458-c747-413d-8a81-43906c765b3d","Type":"ContainerStarted","Data":"68cd8e4eb9b15d6db8dc52d732a5e2fba6f5cae66ef0c6dc9d64b312838cf1f7"} Nov 26 14:52:58 crc kubenswrapper[4651]: I1126 14:52:58.851195 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzt4w" event={"ID":"5db8f2dc-6548-4271-9e83-27e1fc8ab069","Type":"ContainerStarted","Data":"8a83d4e5b2d079179b0698a08d3c3c88330f7255ed2ba8b8ee191b65fa9abe6e"} Nov 26 14:52:58 crc kubenswrapper[4651]: I1126 14:52:58.867706 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qzt4w" podStartSLOduration=2.305061727 podStartE2EDuration="4.86768778s" podCreationTimestamp="2025-11-26 14:52:54 +0000 UTC" firstStartedPulling="2025-11-26 14:52:55.791821684 +0000 UTC m=+143.217569288" lastFinishedPulling="2025-11-26 14:52:58.354447737 +0000 UTC m=+145.780195341" observedRunningTime="2025-11-26 14:52:58.866163753 +0000 UTC m=+146.291911377" watchObservedRunningTime="2025-11-26 14:52:58.86768778 +0000 UTC m=+146.293435394" Nov 26 14:52:58 crc kubenswrapper[4651]: I1126 14:52:58.870429 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vtwvs" podStartSLOduration=3.9622100810000003 podStartE2EDuration="5.870414455s" podCreationTimestamp="2025-11-26 14:52:53 +0000 UTC" firstStartedPulling="2025-11-26 14:52:54.778056253 +0000 UTC m=+142.203803857" lastFinishedPulling="2025-11-26 14:52:56.686260627 +0000 UTC m=+144.112008231" observedRunningTime="2025-11-26 14:52:57.895395354 +0000 UTC m=+145.321142958" 
watchObservedRunningTime="2025-11-26 14:52:58.870414455 +0000 UTC m=+146.296162079" Nov 26 14:52:59 crc kubenswrapper[4651]: I1126 14:52:59.135624 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:52:59 crc kubenswrapper[4651]: I1126 14:52:59.135679 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:53:02 crc kubenswrapper[4651]: I1126 14:53:02.614569 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:53:02 crc kubenswrapper[4651]: I1126 14:53:02.614940 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:53:02 crc kubenswrapper[4651]: I1126 14:53:02.659541 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:53:02 crc kubenswrapper[4651]: I1126 14:53:02.908603 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z5pzs" Nov 26 14:53:03 crc kubenswrapper[4651]: I1126 14:53:03.881247 4651 generic.go:334] "Generic (PLEG): container finished" podID="999d25b1-0acb-4cb4-bb7b-f65d770cf7e6" containerID="56ceb200d500a8adc096425ac8881f30365084f5d649debb1bd257cfd37882f4" exitCode=0 Nov 26 14:53:03 crc kubenswrapper[4651]: I1126 14:53:03.882457 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hsg6c" event={"ID":"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6","Type":"ContainerDied","Data":"56ceb200d500a8adc096425ac8881f30365084f5d649debb1bd257cfd37882f4"} Nov 26 14:53:04 crc kubenswrapper[4651]: I1126 14:53:04.008844 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:53:04 crc kubenswrapper[4651]: I1126 14:53:04.008916 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:53:04 crc kubenswrapper[4651]: I1126 14:53:04.047408 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:53:04 crc kubenswrapper[4651]: I1126 14:53:04.888552 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsg6c" event={"ID":"999d25b1-0acb-4cb4-bb7b-f65d770cf7e6","Type":"ContainerStarted","Data":"c3fc720a68d87cb78915f8ced9278d3c4b6966f2a57062ec54aa967c936a5fe4"} Nov 26 14:53:04 crc kubenswrapper[4651]: I1126 14:53:04.933948 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vtwvs" Nov 26 14:53:04 crc kubenswrapper[4651]: I1126 14:53:04.960688 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hsg6c" podStartSLOduration=2.351966502 podStartE2EDuration="8.960668067s" podCreationTimestamp="2025-11-26 14:52:56 +0000 UTC" firstStartedPulling="2025-11-26 14:52:57.828025969 +0000 UTC m=+145.253773573" lastFinishedPulling="2025-11-26 14:53:04.436727534 +0000 UTC m=+151.862475138" observedRunningTime="2025-11-26 14:53:04.911646064 +0000 UTC m=+152.337393698" watchObservedRunningTime="2025-11-26 14:53:04.960668067 +0000 UTC m=+152.386415671" Nov 26 14:53:05 crc kubenswrapper[4651]: I1126 14:53:05.051738 4651 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qzt4w"
Nov 26 14:53:05 crc kubenswrapper[4651]: I1126 14:53:05.051774 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qzt4w"
Nov 26 14:53:05 crc kubenswrapper[4651]: I1126 14:53:05.087284 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qzt4w"
Nov 26 14:53:05 crc kubenswrapper[4651]: I1126 14:53:05.935133 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qzt4w"
Nov 26 14:53:06 crc kubenswrapper[4651]: I1126 14:53:06.410949 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hsg6c"
Nov 26 14:53:06 crc kubenswrapper[4651]: I1126 14:53:06.411176 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hsg6c"
Nov 26 14:53:07 crc kubenswrapper[4651]: I1126 14:53:07.460173 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hsg6c" podUID="999d25b1-0acb-4cb4-bb7b-f65d770cf7e6" containerName="registry-server" probeResult="failure" output=<
Nov 26 14:53:07 crc kubenswrapper[4651]: timeout: failed to connect service ":50051" within 1s
Nov 26 14:53:07 crc kubenswrapper[4651]: >
Nov 26 14:53:16 crc kubenswrapper[4651]: I1126 14:53:16.459717 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hsg6c"
Nov 26 14:53:16 crc kubenswrapper[4651]: I1126 14:53:16.522896 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hsg6c"
Nov 26 14:53:29 crc kubenswrapper[4651]: I1126 14:53:29.132852 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 14:53:29 crc kubenswrapper[4651]: I1126 14:53:29.133488 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 14:53:59 crc kubenswrapper[4651]: I1126 14:53:59.133280 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 14:53:59 crc kubenswrapper[4651]: I1126 14:53:59.134705 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 14:53:59 crc kubenswrapper[4651]: I1126 14:53:59.134842 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-99mrs"
Nov 26 14:53:59 crc kubenswrapper[4651]: I1126 14:53:59.135583 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04c4e1b60bd0f3f1f0ee1ad045adcf48c93c1df3028087c0c63c1fc18ffe7234"} pod="openshift-machine-config-operator/machine-config-daemon-99mrs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 26 14:53:59 crc kubenswrapper[4651]: I1126 14:53:59.135849 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" containerID="cri-o://04c4e1b60bd0f3f1f0ee1ad045adcf48c93c1df3028087c0c63c1fc18ffe7234" gracePeriod=600
Nov 26 14:54:00 crc kubenswrapper[4651]: I1126 14:54:00.201363 4651 generic.go:334] "Generic (PLEG): container finished" podID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerID="04c4e1b60bd0f3f1f0ee1ad045adcf48c93c1df3028087c0c63c1fc18ffe7234" exitCode=0
Nov 26 14:54:00 crc kubenswrapper[4651]: I1126 14:54:00.201471 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerDied","Data":"04c4e1b60bd0f3f1f0ee1ad045adcf48c93c1df3028087c0c63c1fc18ffe7234"}
Nov 26 14:54:00 crc kubenswrapper[4651]: I1126 14:54:00.201883 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"14324e572c15dd66656d2e3c90434fa5f4abfaec71320df4719abee588df2197"}
Nov 26 14:55:59 crc kubenswrapper[4651]: I1126 14:55:59.132933 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 14:55:59 crc kubenswrapper[4651]: I1126 14:55:59.133808 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.009896 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2s2qj"]
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.010927 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.050485 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2s2qj"]
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.093137 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-bound-sa-token\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.093203 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.093231 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-registry-tls\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.093259 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.093295 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjlcn\" (UniqueName: \"kubernetes.io/projected/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-kube-api-access-jjlcn\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.093317 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.093335 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-trusted-ca\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.093353 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-registry-certificates\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.187349 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.194176 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.194222 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-registry-tls\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.194300 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjlcn\" (UniqueName: \"kubernetes.io/projected/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-kube-api-access-jjlcn\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.194331 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.194353 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-trusted-ca\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.194384 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-registry-certificates\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.194416 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-bound-sa-token\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.195504 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.195751 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-trusted-ca\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.196697 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-registry-certificates\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.210339 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.212486 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-bound-sa-token\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.212904 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjlcn\" (UniqueName: \"kubernetes.io/projected/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-kube-api-access-jjlcn\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.213916 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9-registry-tls\") pod \"image-registry-66df7c8f76-2s2qj\" (UID: \"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9\") " pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.343460 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.547292 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2s2qj"]
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.932635 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj" event={"ID":"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9","Type":"ContainerStarted","Data":"55ee9494a6bcca1d19240d113a04a8cb71e645a3fba7b06a03bcae7f242777c0"}
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.932689 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj" event={"ID":"6e4ad518-5d99-454a-a4b6-6ab9c8ef8fd9","Type":"ContainerStarted","Data":"4135037be2a6a25eb8ff23a2b6b17798eff9fbf2836df6e5fe5e08c58116375d"}
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.933050 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:15 crc kubenswrapper[4651]: I1126 14:56:15.949340 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj" podStartSLOduration=1.949321416 podStartE2EDuration="1.949321416s" podCreationTimestamp="2025-11-26 14:56:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:56:15.947925327 +0000 UTC m=+343.373672941" watchObservedRunningTime="2025-11-26 14:56:15.949321416 +0000 UTC m=+343.375069020"
Nov 26 14:56:29 crc kubenswrapper[4651]: I1126 14:56:29.132989 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 14:56:29 crc kubenswrapper[4651]: I1126 14:56:29.133811 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 14:56:35 crc kubenswrapper[4651]: I1126 14:56:35.354704 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-2s2qj"
Nov 26 14:56:35 crc kubenswrapper[4651]: I1126 14:56:35.433880 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bb2l7"]
Nov 26 14:56:59 crc kubenswrapper[4651]: I1126 14:56:59.137703 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 14:56:59 crc kubenswrapper[4651]: I1126 14:56:59.139158 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 14:56:59 crc kubenswrapper[4651]: I1126 14:56:59.139222 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-99mrs"
Nov 26 14:56:59 crc kubenswrapper[4651]: I1126 14:56:59.139800 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14324e572c15dd66656d2e3c90434fa5f4abfaec71320df4719abee588df2197"} pod="openshift-machine-config-operator/machine-config-daemon-99mrs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 26 14:56:59 crc kubenswrapper[4651]: I1126 14:56:59.139860 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" containerID="cri-o://14324e572c15dd66656d2e3c90434fa5f4abfaec71320df4719abee588df2197" gracePeriod=600
Nov 26 14:57:00 crc kubenswrapper[4651]: I1126 14:57:00.165764 4651 generic.go:334] "Generic (PLEG): container finished" podID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerID="14324e572c15dd66656d2e3c90434fa5f4abfaec71320df4719abee588df2197" exitCode=0
Nov 26 14:57:00 crc kubenswrapper[4651]: I1126 14:57:00.166296 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerDied","Data":"14324e572c15dd66656d2e3c90434fa5f4abfaec71320df4719abee588df2197"}
Nov 26 14:57:00 crc kubenswrapper[4651]: I1126 14:57:00.166355 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"77c8189b80a1a06a684db450cc919068d52888695cc9756916189ce184f0c190"}
Nov 26 14:57:00 crc kubenswrapper[4651]: I1126 14:57:00.166373 4651 scope.go:117] "RemoveContainer" containerID="04c4e1b60bd0f3f1f0ee1ad045adcf48c93c1df3028087c0c63c1fc18ffe7234"
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.089513 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" podUID="a6ff5e03-1863-4dad-bc3a-9c21d0521b17" containerName="registry" containerID="cri-o://2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73" gracePeriod=30
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.919989 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7"
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.946695 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-installation-pull-secrets\") pod \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") "
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.946797 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-tls\") pod \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") "
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.946829 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-ca-trust-extracted\") pod \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") "
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.946857 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-bound-sa-token\") pod \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") "
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.947077 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") "
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.947129 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-certificates\") pod \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") "
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.947164 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-trusted-ca\") pod \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") "
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.947183 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjk9n\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-kube-api-access-xjk9n\") pod \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\" (UID: \"a6ff5e03-1863-4dad-bc3a-9c21d0521b17\") "
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.948844 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a6ff5e03-1863-4dad-bc3a-9c21d0521b17" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.948907 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a6ff5e03-1863-4dad-bc3a-9c21d0521b17" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.959172 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a6ff5e03-1863-4dad-bc3a-9c21d0521b17" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.959299 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a6ff5e03-1863-4dad-bc3a-9c21d0521b17" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.959469 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a6ff5e03-1863-4dad-bc3a-9c21d0521b17" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.959639 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-kube-api-access-xjk9n" (OuterVolumeSpecName: "kube-api-access-xjk9n") pod "a6ff5e03-1863-4dad-bc3a-9c21d0521b17" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17"). InnerVolumeSpecName "kube-api-access-xjk9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 14:57:01 crc kubenswrapper[4651]: I1126 14:57:01.968107 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a6ff5e03-1863-4dad-bc3a-9c21d0521b17" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.015525 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a6ff5e03-1863-4dad-bc3a-9c21d0521b17" (UID: "a6ff5e03-1863-4dad-bc3a-9c21d0521b17"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.048087 4651 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-certificates\") on node \"crc\" DevicePath \"\""
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.048117 4651 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.048128 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjk9n\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-kube-api-access-xjk9n\") on node \"crc\" DevicePath \"\""
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.048138 4651 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.048147 4651 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-registry-tls\") on node \"crc\" DevicePath \"\""
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.048155 4651 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.048163 4651 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6ff5e03-1863-4dad-bc3a-9c21d0521b17-bound-sa-token\") on node \"crc\" DevicePath \"\""
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.182397 4651 generic.go:334] "Generic (PLEG): container finished" podID="a6ff5e03-1863-4dad-bc3a-9c21d0521b17" containerID="2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73" exitCode=0
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.182457 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" event={"ID":"a6ff5e03-1863-4dad-bc3a-9c21d0521b17","Type":"ContainerDied","Data":"2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73"}
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.182548 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7" event={"ID":"a6ff5e03-1863-4dad-bc3a-9c21d0521b17","Type":"ContainerDied","Data":"47729edd6ceb4e91de49632deb5aefc4016e22dd0225e752ac56c24f4828dce5"}
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.182602 4651 scope.go:117] "RemoveContainer" containerID="2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73"
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.182734 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bb2l7"
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.209962 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bb2l7"]
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.213815 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bb2l7"]
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.222342 4651 scope.go:117] "RemoveContainer" containerID="2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73"
Nov 26 14:57:02 crc kubenswrapper[4651]: E1126 14:57:02.222784 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73\": container with ID starting with 2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73 not found: ID does not exist" containerID="2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73"
Nov 26 14:57:02 crc kubenswrapper[4651]: I1126 14:57:02.222819 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73"} err="failed to get container status \"2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73\": rpc error: code = NotFound desc = could not find container \"2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73\": container with ID starting with 2b9ea4b5b2662d88f8f163bdc62391bce8c9f44642f0bc7217c34ee08476aa73 not found: ID does not exist"
Nov 26 14:57:03 crc kubenswrapper[4651]: I1126 14:57:03.419734 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ff5e03-1863-4dad-bc3a-9c21d0521b17" path="/var/lib/kubelet/pods/a6ff5e03-1863-4dad-bc3a-9c21d0521b17/volumes"
Nov 26 14:58:59 crc kubenswrapper[4651]: I1126 14:58:59.132858 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 14:58:59 crc kubenswrapper[4651]: I1126 14:58:59.133339 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.346511 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-rnxlv"]
Nov 26 14:59:11 crc kubenswrapper[4651]: E1126 14:59:11.347225 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ff5e03-1863-4dad-bc3a-9c21d0521b17" containerName="registry"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.347242 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ff5e03-1863-4dad-bc3a-9c21d0521b17" containerName="registry"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.347343 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ff5e03-1863-4dad-bc3a-9c21d0521b17" containerName="registry"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.347688 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-rnxlv"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.352939 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.353509 4651 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jw7t8"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.356702 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.376242 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qksf2"]
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.376861 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qksf2"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.381435 4651 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-vnccs"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.396364 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qksf2"]
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.400581 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-wvl8s"]
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.401420 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-wvl8s"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.405387 4651 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-9p5j9"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.412822 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-rnxlv"]
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.416214 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-wvl8s"]
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.450577 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77d8p\" (UniqueName: \"kubernetes.io/projected/38d33193-a6f6-42e3-982d-59abd72e12f2-kube-api-access-77d8p\") pod \"cert-manager-cainjector-7f985d654d-rnxlv\" (UID: \"38d33193-a6f6-42e3-982d-59abd72e12f2\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-rnxlv"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.551539 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znmcc\" (UniqueName: \"kubernetes.io/projected/00fbc8fa-7a28-4489-9e75-5c309cc83d87-kube-api-access-znmcc\") pod \"cert-manager-5b446d88c5-qksf2\" (UID: \"00fbc8fa-7a28-4489-9e75-5c309cc83d87\") " pod="cert-manager/cert-manager-5b446d88c5-qksf2"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.551602 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmsjq\" (UniqueName: \"kubernetes.io/projected/0ff4812f-612b-438b-8a92-7a9ff86bcdda-kube-api-access-zmsjq\") pod \"cert-manager-webhook-5655c58dd6-wvl8s\" (UID: \"0ff4812f-612b-438b-8a92-7a9ff86bcdda\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-wvl8s"
Nov 26 14:59:11 crc kubenswrapper[4651]: I1126
14:59:11.551723 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77d8p\" (UniqueName: \"kubernetes.io/projected/38d33193-a6f6-42e3-982d-59abd72e12f2-kube-api-access-77d8p\") pod \"cert-manager-cainjector-7f985d654d-rnxlv\" (UID: \"38d33193-a6f6-42e3-982d-59abd72e12f2\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-rnxlv" Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.572262 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77d8p\" (UniqueName: \"kubernetes.io/projected/38d33193-a6f6-42e3-982d-59abd72e12f2-kube-api-access-77d8p\") pod \"cert-manager-cainjector-7f985d654d-rnxlv\" (UID: \"38d33193-a6f6-42e3-982d-59abd72e12f2\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-rnxlv" Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.652632 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znmcc\" (UniqueName: \"kubernetes.io/projected/00fbc8fa-7a28-4489-9e75-5c309cc83d87-kube-api-access-znmcc\") pod \"cert-manager-5b446d88c5-qksf2\" (UID: \"00fbc8fa-7a28-4489-9e75-5c309cc83d87\") " pod="cert-manager/cert-manager-5b446d88c5-qksf2" Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.652973 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmsjq\" (UniqueName: \"kubernetes.io/projected/0ff4812f-612b-438b-8a92-7a9ff86bcdda-kube-api-access-zmsjq\") pod \"cert-manager-webhook-5655c58dd6-wvl8s\" (UID: \"0ff4812f-612b-438b-8a92-7a9ff86bcdda\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-wvl8s" Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.663622 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-rnxlv" Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.677170 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znmcc\" (UniqueName: \"kubernetes.io/projected/00fbc8fa-7a28-4489-9e75-5c309cc83d87-kube-api-access-znmcc\") pod \"cert-manager-5b446d88c5-qksf2\" (UID: \"00fbc8fa-7a28-4489-9e75-5c309cc83d87\") " pod="cert-manager/cert-manager-5b446d88c5-qksf2" Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.680963 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmsjq\" (UniqueName: \"kubernetes.io/projected/0ff4812f-612b-438b-8a92-7a9ff86bcdda-kube-api-access-zmsjq\") pod \"cert-manager-webhook-5655c58dd6-wvl8s\" (UID: \"0ff4812f-612b-438b-8a92-7a9ff86bcdda\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-wvl8s" Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.691615 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qksf2" Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.714532 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-wvl8s" Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.946787 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-rnxlv"] Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.960203 4651 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 14:59:11 crc kubenswrapper[4651]: I1126 14:59:11.983546 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qksf2"] Nov 26 14:59:12 crc kubenswrapper[4651]: I1126 14:59:12.021345 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-wvl8s"] Nov 26 14:59:12 crc kubenswrapper[4651]: I1126 14:59:12.926369 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-rnxlv" event={"ID":"38d33193-a6f6-42e3-982d-59abd72e12f2","Type":"ContainerStarted","Data":"b2a03ab8800a91b61ab5f3c33a7fbb78f133235fb37296889d3907cbbbd7f7d2"} Nov 26 14:59:12 crc kubenswrapper[4651]: I1126 14:59:12.929517 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qksf2" event={"ID":"00fbc8fa-7a28-4489-9e75-5c309cc83d87","Type":"ContainerStarted","Data":"5b45cd558ef0f6804272432dbb081118066641db12f6bf5fbc7a6d71df6e5d16"} Nov 26 14:59:12 crc kubenswrapper[4651]: I1126 14:59:12.930596 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-wvl8s" event={"ID":"0ff4812f-612b-438b-8a92-7a9ff86bcdda","Type":"ContainerStarted","Data":"9406c4cde11a701baeac43efd7399885174197b276c018a1fe6655d8281a5fd9"} Nov 26 14:59:16 crc kubenswrapper[4651]: I1126 14:59:16.956249 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-wvl8s" 
event={"ID":"0ff4812f-612b-438b-8a92-7a9ff86bcdda","Type":"ContainerStarted","Data":"331c94e93d2b7697e267089a7ce1f8e97c37efec7830b5352a60b1ec5cc89773"} Nov 26 14:59:16 crc kubenswrapper[4651]: I1126 14:59:16.957022 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-wvl8s" Nov 26 14:59:16 crc kubenswrapper[4651]: I1126 14:59:16.957591 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-rnxlv" event={"ID":"38d33193-a6f6-42e3-982d-59abd72e12f2","Type":"ContainerStarted","Data":"716ce6254e983d3a9656b9a7e3beda0fca61a62d7307b6e3f53521e46b729636"} Nov 26 14:59:16 crc kubenswrapper[4651]: I1126 14:59:16.959887 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qksf2" event={"ID":"00fbc8fa-7a28-4489-9e75-5c309cc83d87","Type":"ContainerStarted","Data":"abbbdd8f778f44a8ed313e1f85f1277815e19cd678f0ef0e70894f4cc0ab0728"} Nov 26 14:59:16 crc kubenswrapper[4651]: I1126 14:59:16.978699 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-wvl8s" podStartSLOduration=1.988703015 podStartE2EDuration="5.978640109s" podCreationTimestamp="2025-11-26 14:59:11 +0000 UTC" firstStartedPulling="2025-11-26 14:59:12.029974132 +0000 UTC m=+519.455721736" lastFinishedPulling="2025-11-26 14:59:16.019911216 +0000 UTC m=+523.445658830" observedRunningTime="2025-11-26 14:59:16.970338753 +0000 UTC m=+524.396086377" watchObservedRunningTime="2025-11-26 14:59:16.978640109 +0000 UTC m=+524.404387763" Nov 26 14:59:16 crc kubenswrapper[4651]: I1126 14:59:16.996364 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-qksf2" podStartSLOduration=1.965161084 podStartE2EDuration="5.996339338s" podCreationTimestamp="2025-11-26 14:59:11 +0000 UTC" firstStartedPulling="2025-11-26 14:59:11.987406597 +0000 UTC 
m=+519.413154201" lastFinishedPulling="2025-11-26 14:59:16.018584851 +0000 UTC m=+523.444332455" observedRunningTime="2025-11-26 14:59:16.992747491 +0000 UTC m=+524.418495115" watchObservedRunningTime="2025-11-26 14:59:16.996339338 +0000 UTC m=+524.422086982" Nov 26 14:59:17 crc kubenswrapper[4651]: I1126 14:59:17.057230 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-rnxlv" podStartSLOduration=1.993228147 podStartE2EDuration="6.05721101s" podCreationTimestamp="2025-11-26 14:59:11 +0000 UTC" firstStartedPulling="2025-11-26 14:59:11.959911832 +0000 UTC m=+519.385659436" lastFinishedPulling="2025-11-26 14:59:16.023894695 +0000 UTC m=+523.449642299" observedRunningTime="2025-11-26 14:59:17.054586799 +0000 UTC m=+524.480334403" watchObservedRunningTime="2025-11-26 14:59:17.05721101 +0000 UTC m=+524.482958614" Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.596143 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mmgnh"] Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.596923 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="ovn-controller" containerID="cri-o://8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111" gracePeriod=30 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.597101 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="ovn-acl-logging" containerID="cri-o://5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a" gracePeriod=30 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.597112 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" 
podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="kube-rbac-proxy-node" containerID="cri-o://326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8" gracePeriod=30 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.597219 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="sbdb" containerID="cri-o://6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549" gracePeriod=30 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.597266 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb" gracePeriod=30 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.597263 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="northd" containerID="cri-o://3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d" gracePeriod=30 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.597029 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="nbdb" containerID="cri-o://bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c" gracePeriod=30 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.635845 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="ovnkube-controller" containerID="cri-o://60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9" 
gracePeriod=30 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.725876 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-wvl8s" Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.947013 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mmgnh_e9ee7939-7a21-4f3a-b534-056415581b10/ovn-acl-logging/0.log" Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.947537 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mmgnh_e9ee7939-7a21-4f3a-b534-056415581b10/ovn-controller/0.log" Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.947992 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.985660 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c884v_88feea33-aa22-45e0-9066-e40e92590ca5/kube-multus/0.log" Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.985707 4651 generic.go:334] "Generic (PLEG): container finished" podID="88feea33-aa22-45e0-9066-e40e92590ca5" containerID="43ad7c819f1d281b57f0f77e053e23c095780a8db7e169d72abc767a922fcfd8" exitCode=2 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.985770 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c884v" event={"ID":"88feea33-aa22-45e0-9066-e40e92590ca5","Type":"ContainerDied","Data":"43ad7c819f1d281b57f0f77e053e23c095780a8db7e169d72abc767a922fcfd8"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.986261 4651 scope.go:117] "RemoveContainer" containerID="43ad7c819f1d281b57f0f77e053e23c095780a8db7e169d72abc767a922fcfd8" Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.990612 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mmgnh_e9ee7939-7a21-4f3a-b534-056415581b10/ovn-acl-logging/0.log" Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991272 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mmgnh_e9ee7939-7a21-4f3a-b534-056415581b10/ovn-controller/0.log" Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991676 4651 generic.go:334] "Generic (PLEG): container finished" podID="e9ee7939-7a21-4f3a-b534-056415581b10" containerID="60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9" exitCode=0 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991699 4651 generic.go:334] "Generic (PLEG): container finished" podID="e9ee7939-7a21-4f3a-b534-056415581b10" containerID="6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549" exitCode=0 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991710 4651 generic.go:334] "Generic (PLEG): container finished" podID="e9ee7939-7a21-4f3a-b534-056415581b10" containerID="bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c" exitCode=0 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991725 4651 generic.go:334] "Generic (PLEG): container finished" podID="e9ee7939-7a21-4f3a-b534-056415581b10" containerID="3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d" exitCode=0 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991739 4651 generic.go:334] "Generic (PLEG): container finished" podID="e9ee7939-7a21-4f3a-b534-056415581b10" containerID="81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb" exitCode=0 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991747 4651 generic.go:334] "Generic (PLEG): container finished" podID="e9ee7939-7a21-4f3a-b534-056415581b10" containerID="326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8" exitCode=0 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991755 4651 generic.go:334] "Generic (PLEG): 
container finished" podID="e9ee7939-7a21-4f3a-b534-056415581b10" containerID="5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a" exitCode=143 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991763 4651 generic.go:334] "Generic (PLEG): container finished" podID="e9ee7939-7a21-4f3a-b534-056415581b10" containerID="8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111" exitCode=143 Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991777 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991790 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerDied","Data":"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991826 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerDied","Data":"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991843 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerDied","Data":"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991857 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerDied","Data":"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991870 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerDied","Data":"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991885 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerDied","Data":"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991898 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991911 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991920 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991929 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerDied","Data":"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991955 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991964 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991971 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991978 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991985 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991993 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.991999 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992007 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992014 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992023 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerDied","Data":"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992052 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992062 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992069 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992076 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992084 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992092 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992099 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 
14:59:21.992106 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992113 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992123 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mmgnh" event={"ID":"e9ee7939-7a21-4f3a-b534-056415581b10","Type":"ContainerDied","Data":"640be5f08b7fc2d1c69bf3c5824b5f712a8025db0e667a4024c8006add07c847"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992135 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992143 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992149 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992157 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992163 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992170 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992176 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992177 4651 scope.go:117] "RemoveContainer" containerID="60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9" Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992183 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111"} Nov 26 14:59:21 crc kubenswrapper[4651]: I1126 14:59:21.992291 4651 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5"} Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.001468 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sjdnj"] Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.002016 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="northd" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002082 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="northd" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.002098 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" 
containerName="ovn-controller" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002106 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="ovn-controller" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.002125 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="nbdb" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002132 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="nbdb" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.002140 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="kube-rbac-proxy-node" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002147 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="kube-rbac-proxy-node" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.002155 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="sbdb" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002161 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="sbdb" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.002169 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="kubecfg-setup" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002176 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="kubecfg-setup" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.002185 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 
14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002191 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.002201 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="ovn-acl-logging" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002209 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="ovn-acl-logging" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.002221 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="ovnkube-controller" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002227 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="ovnkube-controller" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002360 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="sbdb" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002373 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="ovn-controller" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002380 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002389 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="ovnkube-controller" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002407 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" 
containerName="kube-rbac-proxy-node" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002414 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="ovn-acl-logging" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002422 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="nbdb" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.002431 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" containerName="northd" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.007148 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.026738 4651 scope.go:117] "RemoveContainer" containerID="6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.042941 4651 scope.go:117] "RemoveContainer" containerID="bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.055791 4651 scope.go:117] "RemoveContainer" containerID="3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.076283 4651 scope.go:117] "RemoveContainer" containerID="81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.090606 4651 scope.go:117] "RemoveContainer" containerID="326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.102206 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-systemd\") pod 
\"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.102238 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-node-log\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.102542 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-script-lib\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.102573 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-ovn-kubernetes\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.102875 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.102920 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nm9k\" (UniqueName: \"kubernetes.io/projected/e9ee7939-7a21-4f3a-b534-056415581b10-kube-api-access-5nm9k\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.102945 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-openvswitch\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.102975 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-netd\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103002 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-kubelet\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103017 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-systemd-units\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103048 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-ovn\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103068 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-config\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103085 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-bin\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103103 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-env-overrides\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103125 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-netns\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103145 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-slash\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 
14:59:22.103169 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103193 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-node-log" (OuterVolumeSpecName: "node-log") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103222 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103238 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103576 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103900 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103937 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103946 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103963 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-slash" (OuterVolumeSpecName: "host-slash") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103978 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103988 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-log-socket" (OuterVolumeSpecName: "log-socket") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103279 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.103180 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-log-socket\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104129 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9ee7939-7a21-4f3a-b534-056415581b10-ovn-node-metrics-cert\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104153 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104176 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-etc-openvswitch\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104193 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-var-lib-openvswitch\") pod \"e9ee7939-7a21-4f3a-b534-056415581b10\" (UID: \"e9ee7939-7a21-4f3a-b534-056415581b10\") " Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104191 4651 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104243 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104329 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104350 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104939 4651 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-log-socket\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104958 4651 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104969 4651 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104980 4651 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104989 4651 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-node-log\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.104999 4651 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.105010 4651 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-ovn-kubernetes\") on node \"crc\" DevicePath 
\"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.105020 4651 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.105029 4651 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.105167 4651 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.105178 4651 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.105189 4651 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.105199 4651 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.105209 4651 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.105218 4651 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/e9ee7939-7a21-4f3a-b534-056415581b10-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.105228 4651 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.105237 4651 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-host-slash\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.109859 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ee7939-7a21-4f3a-b534-056415581b10-kube-api-access-5nm9k" (OuterVolumeSpecName: "kube-api-access-5nm9k") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "kube-api-access-5nm9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.109998 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ee7939-7a21-4f3a-b534-056415581b10-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.110890 4651 scope.go:117] "RemoveContainer" containerID="5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.119801 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e9ee7939-7a21-4f3a-b534-056415581b10" (UID: "e9ee7939-7a21-4f3a-b534-056415581b10"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.126094 4651 scope.go:117] "RemoveContainer" containerID="8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.145082 4651 scope.go:117] "RemoveContainer" containerID="3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.159842 4651 scope.go:117] "RemoveContainer" containerID="60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.160396 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9\": container with ID starting with 60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9 not found: ID does not exist" containerID="60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.160434 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9"} err="failed to get container status 
\"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9\": rpc error: code = NotFound desc = could not find container \"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9\": container with ID starting with 60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.160461 4651 scope.go:117] "RemoveContainer" containerID="6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.160935 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549\": container with ID starting with 6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549 not found: ID does not exist" containerID="6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.160969 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549"} err="failed to get container status \"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549\": rpc error: code = NotFound desc = could not find container \"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549\": container with ID starting with 6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.161020 4651 scope.go:117] "RemoveContainer" containerID="bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.161589 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c\": container with ID starting with bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c not found: ID does not exist" containerID="bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.161666 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c"} err="failed to get container status \"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c\": rpc error: code = NotFound desc = could not find container \"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c\": container with ID starting with bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.161702 4651 scope.go:117] "RemoveContainer" containerID="3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.162042 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d\": container with ID starting with 3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d not found: ID does not exist" containerID="3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.162071 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d"} err="failed to get container status \"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d\": rpc error: code = NotFound desc = could not find container \"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d\": container with ID 
starting with 3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.162085 4651 scope.go:117] "RemoveContainer" containerID="81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.162378 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb\": container with ID starting with 81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb not found: ID does not exist" containerID="81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.162407 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb"} err="failed to get container status \"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb\": rpc error: code = NotFound desc = could not find container \"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb\": container with ID starting with 81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.162425 4651 scope.go:117] "RemoveContainer" containerID="326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.162895 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8\": container with ID starting with 326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8 not found: ID does not exist" containerID="326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8" Nov 26 
14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.162922 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8"} err="failed to get container status \"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8\": rpc error: code = NotFound desc = could not find container \"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8\": container with ID starting with 326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.162936 4651 scope.go:117] "RemoveContainer" containerID="5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.163230 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a\": container with ID starting with 5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a not found: ID does not exist" containerID="5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.163261 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a"} err="failed to get container status \"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a\": rpc error: code = NotFound desc = could not find container \"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a\": container with ID starting with 5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.163281 4651 scope.go:117] "RemoveContainer" 
containerID="8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.163696 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111\": container with ID starting with 8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111 not found: ID does not exist" containerID="8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.163736 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111"} err="failed to get container status \"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111\": rpc error: code = NotFound desc = could not find container \"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111\": container with ID starting with 8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.163749 4651 scope.go:117] "RemoveContainer" containerID="3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5" Nov 26 14:59:22 crc kubenswrapper[4651]: E1126 14:59:22.163985 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5\": container with ID starting with 3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5 not found: ID does not exist" containerID="3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.164007 4651 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5"} err="failed to get container status \"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5\": rpc error: code = NotFound desc = could not find container \"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5\": container with ID starting with 3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.164021 4651 scope.go:117] "RemoveContainer" containerID="60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.164459 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9"} err="failed to get container status \"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9\": rpc error: code = NotFound desc = could not find container \"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9\": container with ID starting with 60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.164478 4651 scope.go:117] "RemoveContainer" containerID="6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.164728 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549"} err="failed to get container status \"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549\": rpc error: code = NotFound desc = could not find container \"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549\": container with ID starting with 6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549 not found: ID does not 
exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.164751 4651 scope.go:117] "RemoveContainer" containerID="bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.164985 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c"} err="failed to get container status \"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c\": rpc error: code = NotFound desc = could not find container \"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c\": container with ID starting with bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.165008 4651 scope.go:117] "RemoveContainer" containerID="3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.165374 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d"} err="failed to get container status \"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d\": rpc error: code = NotFound desc = could not find container \"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d\": container with ID starting with 3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.165396 4651 scope.go:117] "RemoveContainer" containerID="81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.165608 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb"} err="failed to get container status 
\"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb\": rpc error: code = NotFound desc = could not find container \"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb\": container with ID starting with 81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.165625 4651 scope.go:117] "RemoveContainer" containerID="326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.165830 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8"} err="failed to get container status \"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8\": rpc error: code = NotFound desc = could not find container \"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8\": container with ID starting with 326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.165853 4651 scope.go:117] "RemoveContainer" containerID="5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.166134 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a"} err="failed to get container status \"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a\": rpc error: code = NotFound desc = could not find container \"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a\": container with ID starting with 5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.166151 4651 scope.go:117] "RemoveContainer" 
containerID="8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.166706 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111"} err="failed to get container status \"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111\": rpc error: code = NotFound desc = could not find container \"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111\": container with ID starting with 8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.166730 4651 scope.go:117] "RemoveContainer" containerID="3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.167077 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5"} err="failed to get container status \"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5\": rpc error: code = NotFound desc = could not find container \"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5\": container with ID starting with 3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.167098 4651 scope.go:117] "RemoveContainer" containerID="60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.167324 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9"} err="failed to get container status \"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9\": rpc error: code = NotFound desc = could 
not find container \"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9\": container with ID starting with 60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.167342 4651 scope.go:117] "RemoveContainer" containerID="6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.167718 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549"} err="failed to get container status \"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549\": rpc error: code = NotFound desc = could not find container \"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549\": container with ID starting with 6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.167762 4651 scope.go:117] "RemoveContainer" containerID="bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.168016 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c"} err="failed to get container status \"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c\": rpc error: code = NotFound desc = could not find container \"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c\": container with ID starting with bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.168053 4651 scope.go:117] "RemoveContainer" containerID="3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 
14:59:22.168334 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d"} err="failed to get container status \"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d\": rpc error: code = NotFound desc = could not find container \"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d\": container with ID starting with 3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.168352 4651 scope.go:117] "RemoveContainer" containerID="81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.168572 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb"} err="failed to get container status \"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb\": rpc error: code = NotFound desc = could not find container \"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb\": container with ID starting with 81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.168588 4651 scope.go:117] "RemoveContainer" containerID="326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.168899 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8"} err="failed to get container status \"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8\": rpc error: code = NotFound desc = could not find container \"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8\": container with ID starting with 
326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.168947 4651 scope.go:117] "RemoveContainer" containerID="5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.169340 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a"} err="failed to get container status \"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a\": rpc error: code = NotFound desc = could not find container \"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a\": container with ID starting with 5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.169368 4651 scope.go:117] "RemoveContainer" containerID="8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.169636 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111"} err="failed to get container status \"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111\": rpc error: code = NotFound desc = could not find container \"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111\": container with ID starting with 8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.169660 4651 scope.go:117] "RemoveContainer" containerID="3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.169930 4651 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5"} err="failed to get container status \"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5\": rpc error: code = NotFound desc = could not find container \"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5\": container with ID starting with 3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.169955 4651 scope.go:117] "RemoveContainer" containerID="60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.170275 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9"} err="failed to get container status \"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9\": rpc error: code = NotFound desc = could not find container \"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9\": container with ID starting with 60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.170325 4651 scope.go:117] "RemoveContainer" containerID="6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.170671 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549"} err="failed to get container status \"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549\": rpc error: code = NotFound desc = could not find container \"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549\": container with ID starting with 6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549 not found: ID does not 
exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.170764 4651 scope.go:117] "RemoveContainer" containerID="bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.171011 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c"} err="failed to get container status \"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c\": rpc error: code = NotFound desc = could not find container \"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c\": container with ID starting with bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.171028 4651 scope.go:117] "RemoveContainer" containerID="3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.171338 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d"} err="failed to get container status \"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d\": rpc error: code = NotFound desc = could not find container \"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d\": container with ID starting with 3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.171362 4651 scope.go:117] "RemoveContainer" containerID="81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.171694 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb"} err="failed to get container status 
\"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb\": rpc error: code = NotFound desc = could not find container \"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb\": container with ID starting with 81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.171716 4651 scope.go:117] "RemoveContainer" containerID="326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.171958 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8"} err="failed to get container status \"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8\": rpc error: code = NotFound desc = could not find container \"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8\": container with ID starting with 326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.171976 4651 scope.go:117] "RemoveContainer" containerID="5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.172274 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a"} err="failed to get container status \"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a\": rpc error: code = NotFound desc = could not find container \"5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a\": container with ID starting with 5ddcc1de3a803d3de574abccbe5737d74c3ebb1cb1f28f40545a7331f27b324a not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.172299 4651 scope.go:117] "RemoveContainer" 
containerID="8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.172594 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111"} err="failed to get container status \"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111\": rpc error: code = NotFound desc = could not find container \"8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111\": container with ID starting with 8d601a4f6b2bc152f0ce5d902739d0dbce6accb7aa0fe6e1d9479cbbdc259111 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.172621 4651 scope.go:117] "RemoveContainer" containerID="3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.172859 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5"} err="failed to get container status \"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5\": rpc error: code = NotFound desc = could not find container \"3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5\": container with ID starting with 3d46aec0c295cdf804c3bc4c99e5fb11ebe42f3828dc482ab1b8400ccb6e9fb5 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.172881 4651 scope.go:117] "RemoveContainer" containerID="60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.173145 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9"} err="failed to get container status \"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9\": rpc error: code = NotFound desc = could 
not find container \"60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9\": container with ID starting with 60b16f4f7bb712dd23ff2caccc6022a3f4d4f2b3715b7d638c4fd9729b82d9e9 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.173199 4651 scope.go:117] "RemoveContainer" containerID="6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.173542 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549"} err="failed to get container status \"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549\": rpc error: code = NotFound desc = could not find container \"6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549\": container with ID starting with 6ab02a949f56e5afbd69a8f1d2982936a892b7cd88947d9bff9b39e75f6a0549 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.173562 4651 scope.go:117] "RemoveContainer" containerID="bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.173786 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c"} err="failed to get container status \"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c\": rpc error: code = NotFound desc = could not find container \"bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c\": container with ID starting with bbc8d85dc6d7b18c7ab24fef031a0123d1e093e10be4531a563b2ca76fd9545c not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.173814 4651 scope.go:117] "RemoveContainer" containerID="3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 
14:59:22.174136 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d"} err="failed to get container status \"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d\": rpc error: code = NotFound desc = could not find container \"3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d\": container with ID starting with 3633c59689b42b1acc811824a4fcc5945d45762cbffdd74672f2d2e843ca6b0d not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.174159 4651 scope.go:117] "RemoveContainer" containerID="81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.174443 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb"} err="failed to get container status \"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb\": rpc error: code = NotFound desc = could not find container \"81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb\": container with ID starting with 81130acad5e815db6bc8c78b6c5cd8ee8f5f1a3d6e09c3d581c9b92c1ae8d1fb not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.174467 4651 scope.go:117] "RemoveContainer" containerID="326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.174812 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8"} err="failed to get container status \"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8\": rpc error: code = NotFound desc = could not find container \"326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8\": container with ID starting with 
326cd2f19e03fb4db936ffd14c252d96f6e3f7a3869967d74c61ff7b66bf5de8 not found: ID does not exist" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206623 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-slash\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206685 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/856b60d0-ad5c-4aed-b0cc-81e65533984d-ovn-node-metrics-cert\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206707 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-etc-openvswitch\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206741 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-node-log\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206761 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-run-netns\") pod 
\"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206784 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206853 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-log-socket\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206877 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/856b60d0-ad5c-4aed-b0cc-81e65533984d-ovnkube-config\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206899 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-systemd-units\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206915 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-run-ovn-kubernetes\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206932 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-cni-netd\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206952 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-kubelet\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206968 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-run-ovn\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.206989 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-run-openvswitch\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.207004 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-var-lib-openvswitch\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.207058 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vglhb\" (UniqueName: \"kubernetes.io/projected/856b60d0-ad5c-4aed-b0cc-81e65533984d-kube-api-access-vglhb\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.207073 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/856b60d0-ad5c-4aed-b0cc-81e65533984d-ovnkube-script-lib\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.207107 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-cni-bin\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.207146 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/856b60d0-ad5c-4aed-b0cc-81e65533984d-env-overrides\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.207179 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-run-systemd\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.207327 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nm9k\" (UniqueName: \"kubernetes.io/projected/e9ee7939-7a21-4f3a-b534-056415581b10-kube-api-access-5nm9k\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.207818 4651 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9ee7939-7a21-4f3a-b534-056415581b10-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.207829 4651 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e9ee7939-7a21-4f3a-b534-056415581b10-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308694 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-node-log\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308729 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-run-netns\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308749 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308771 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-log-socket\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308784 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/856b60d0-ad5c-4aed-b0cc-81e65533984d-ovnkube-config\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308805 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-systemd-units\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308823 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-run-ovn-kubernetes\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308857 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-cni-netd\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308883 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-kubelet\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308900 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-run-ovn\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308922 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-run-openvswitch\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308942 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-var-lib-openvswitch\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308967 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vglhb\" (UniqueName: 
\"kubernetes.io/projected/856b60d0-ad5c-4aed-b0cc-81e65533984d-kube-api-access-vglhb\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308982 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/856b60d0-ad5c-4aed-b0cc-81e65533984d-ovnkube-script-lib\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.308998 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-cni-bin\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309012 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/856b60d0-ad5c-4aed-b0cc-81e65533984d-env-overrides\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309051 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-run-systemd\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309076 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-slash\") pod 
\"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309091 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/856b60d0-ad5c-4aed-b0cc-81e65533984d-ovn-node-metrics-cert\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309106 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-etc-openvswitch\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309175 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-etc-openvswitch\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309207 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-node-log\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309227 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-run-netns\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309247 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309269 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-log-socket\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309732 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-var-lib-openvswitch\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309833 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/856b60d0-ad5c-4aed-b0cc-81e65533984d-ovnkube-config\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309838 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-systemd-units\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 
crc kubenswrapper[4651]: I1126 14:59:22.309886 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-run-systemd\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309909 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-slash\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.309999 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-run-ovn-kubernetes\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.310101 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-cni-netd\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.310182 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-kubelet\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.310256 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-run-ovn\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.310306 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/856b60d0-ad5c-4aed-b0cc-81e65533984d-env-overrides\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.310328 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-run-openvswitch\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.310401 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/856b60d0-ad5c-4aed-b0cc-81e65533984d-host-cni-bin\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.310508 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/856b60d0-ad5c-4aed-b0cc-81e65533984d-ovnkube-script-lib\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.312398 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/856b60d0-ad5c-4aed-b0cc-81e65533984d-ovn-node-metrics-cert\") pod 
\"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.320222 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mmgnh"] Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.332557 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vglhb\" (UniqueName: \"kubernetes.io/projected/856b60d0-ad5c-4aed-b0cc-81e65533984d-kube-api-access-vglhb\") pod \"ovnkube-node-sjdnj\" (UID: \"856b60d0-ad5c-4aed-b0cc-81e65533984d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.337356 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mmgnh"] Nov 26 14:59:22 crc kubenswrapper[4651]: I1126 14:59:22.627992 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:23 crc kubenswrapper[4651]: I1126 14:59:22.999782 4651 generic.go:334] "Generic (PLEG): container finished" podID="856b60d0-ad5c-4aed-b0cc-81e65533984d" containerID="facd006c22ea843b358d85350c839d2eccb2ff4796cce0abdd31024dd755eaac" exitCode=0 Nov 26 14:59:23 crc kubenswrapper[4651]: I1126 14:59:22.999906 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" event={"ID":"856b60d0-ad5c-4aed-b0cc-81e65533984d","Type":"ContainerDied","Data":"facd006c22ea843b358d85350c839d2eccb2ff4796cce0abdd31024dd755eaac"} Nov 26 14:59:23 crc kubenswrapper[4651]: I1126 14:59:23.000254 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" event={"ID":"856b60d0-ad5c-4aed-b0cc-81e65533984d","Type":"ContainerStarted","Data":"6e96a806bdf2c6be94cc35e57f86b2c74a286120d0fa076d145472acfd17fc12"} Nov 26 14:59:23 crc kubenswrapper[4651]: I1126 14:59:23.003843 
4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c884v_88feea33-aa22-45e0-9066-e40e92590ca5/kube-multus/0.log" Nov 26 14:59:23 crc kubenswrapper[4651]: I1126 14:59:23.003987 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c884v" event={"ID":"88feea33-aa22-45e0-9066-e40e92590ca5","Type":"ContainerStarted","Data":"5061818b33bb9f358cde7419f5ba80b7a51707dc3f03fb550a1d8d55915e0b6a"} Nov 26 14:59:23 crc kubenswrapper[4651]: I1126 14:59:23.421081 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ee7939-7a21-4f3a-b534-056415581b10" path="/var/lib/kubelet/pods/e9ee7939-7a21-4f3a-b534-056415581b10/volumes" Nov 26 14:59:24 crc kubenswrapper[4651]: I1126 14:59:24.019495 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" event={"ID":"856b60d0-ad5c-4aed-b0cc-81e65533984d","Type":"ContainerStarted","Data":"977b49b2246b6f770ca65077d06ec29a0094e374e07523898a4915b1dc4f1f90"} Nov 26 14:59:24 crc kubenswrapper[4651]: I1126 14:59:24.019564 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" event={"ID":"856b60d0-ad5c-4aed-b0cc-81e65533984d","Type":"ContainerStarted","Data":"53e721c933213ff3cc421e0ccbb885fbe0aeb7a1b28ac50d7da7f643aaccebec"} Nov 26 14:59:24 crc kubenswrapper[4651]: I1126 14:59:24.019579 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" event={"ID":"856b60d0-ad5c-4aed-b0cc-81e65533984d","Type":"ContainerStarted","Data":"5cc6e70c3d1c9537c752e4281ec5fcc3a48ce5ec5261475ff3f4ff450ece5ccd"} Nov 26 14:59:24 crc kubenswrapper[4651]: I1126 14:59:24.019590 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" event={"ID":"856b60d0-ad5c-4aed-b0cc-81e65533984d","Type":"ContainerStarted","Data":"ba46834f8d8c66ca2dee7d0da1da0e01f59b0194a6c092db74973afe752cd587"} Nov 26 
14:59:24 crc kubenswrapper[4651]: I1126 14:59:24.019601 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" event={"ID":"856b60d0-ad5c-4aed-b0cc-81e65533984d","Type":"ContainerStarted","Data":"ca5541ea0dcc7545e300e5309acec6b41af0fcf484df5ba75f20c661933600da"} Nov 26 14:59:24 crc kubenswrapper[4651]: I1126 14:59:24.019629 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" event={"ID":"856b60d0-ad5c-4aed-b0cc-81e65533984d","Type":"ContainerStarted","Data":"a0dbc43114d1d7e20da87b4f19569b0a0e34bcdb9e48aabb3921505a37e6619d"} Nov 26 14:59:26 crc kubenswrapper[4651]: I1126 14:59:26.033425 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" event={"ID":"856b60d0-ad5c-4aed-b0cc-81e65533984d","Type":"ContainerStarted","Data":"637d4277cf1159886d5e6a89939f4c533d67aa09311938cc694a69579c8126e9"} Nov 26 14:59:29 crc kubenswrapper[4651]: I1126 14:59:29.059280 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" event={"ID":"856b60d0-ad5c-4aed-b0cc-81e65533984d","Type":"ContainerStarted","Data":"fda6baa7fcb4654d9a5d9d9b2e2ecb3071658cefd1567805cc9a31009713a518"} Nov 26 14:59:29 crc kubenswrapper[4651]: I1126 14:59:29.060772 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:29 crc kubenswrapper[4651]: I1126 14:59:29.060899 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:29 crc kubenswrapper[4651]: I1126 14:59:29.060933 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:29 crc kubenswrapper[4651]: I1126 14:59:29.140926 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:59:29 crc kubenswrapper[4651]: I1126 14:59:29.140999 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:59:29 crc kubenswrapper[4651]: I1126 14:59:29.148174 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" podStartSLOduration=8.148156365 podStartE2EDuration="8.148156365s" podCreationTimestamp="2025-11-26 14:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 14:59:29.144649141 +0000 UTC m=+536.570396755" watchObservedRunningTime="2025-11-26 14:59:29.148156365 +0000 UTC m=+536.573903979" Nov 26 14:59:29 crc kubenswrapper[4651]: I1126 14:59:29.149757 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:29 crc kubenswrapper[4651]: I1126 14:59:29.149818 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:52 crc kubenswrapper[4651]: I1126 14:59:52.654678 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sjdnj" Nov 26 14:59:59 crc kubenswrapper[4651]: I1126 14:59:59.132598 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 14:59:59 crc kubenswrapper[4651]: I1126 14:59:59.133218 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 14:59:59 crc kubenswrapper[4651]: I1126 14:59:59.133259 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 14:59:59 crc kubenswrapper[4651]: I1126 14:59:59.133840 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77c8189b80a1a06a684db450cc919068d52888695cc9756916189ce184f0c190"} pod="openshift-machine-config-operator/machine-config-daemon-99mrs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 14:59:59 crc kubenswrapper[4651]: I1126 14:59:59.133889 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" containerID="cri-o://77c8189b80a1a06a684db450cc919068d52888695cc9756916189ce184f0c190" gracePeriod=600 Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.136908 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x"] Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.137892 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.140222 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.140723 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.147681 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x"] Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.234805 4651 generic.go:334] "Generic (PLEG): container finished" podID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerID="77c8189b80a1a06a684db450cc919068d52888695cc9756916189ce184f0c190" exitCode=0 Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.234882 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerDied","Data":"77c8189b80a1a06a684db450cc919068d52888695cc9756916189ce184f0c190"} Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.235278 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"c9df9330edcd7367fada547dd9b0bad3227c48b21a556e1698b8293c8ff9fe4a"} Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.235304 4651 scope.go:117] "RemoveContainer" containerID="14324e572c15dd66656d2e3c90434fa5f4abfaec71320df4719abee588df2197" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.285674 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tksk\" 
(UniqueName: \"kubernetes.io/projected/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-kube-api-access-4tksk\") pod \"collect-profiles-29402820-9fs8x\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.285738 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-config-volume\") pod \"collect-profiles-29402820-9fs8x\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.285769 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-secret-volume\") pod \"collect-profiles-29402820-9fs8x\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.387912 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tksk\" (UniqueName: \"kubernetes.io/projected/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-kube-api-access-4tksk\") pod \"collect-profiles-29402820-9fs8x\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.387996 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-config-volume\") pod \"collect-profiles-29402820-9fs8x\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" 
Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.388060 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-secret-volume\") pod \"collect-profiles-29402820-9fs8x\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.389128 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-config-volume\") pod \"collect-profiles-29402820-9fs8x\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.394510 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-secret-volume\") pod \"collect-profiles-29402820-9fs8x\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.411674 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tksk\" (UniqueName: \"kubernetes.io/projected/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-kube-api-access-4tksk\") pod \"collect-profiles-29402820-9fs8x\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.486014 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:00 crc kubenswrapper[4651]: I1126 15:00:00.659996 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x"] Nov 26 15:00:01 crc kubenswrapper[4651]: I1126 15:00:01.247681 4651 generic.go:334] "Generic (PLEG): container finished" podID="406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25" containerID="bbc70c041fa11857e481f00870f1d514093f93490f9b67cfb14120b0f1832159" exitCode=0 Nov 26 15:00:01 crc kubenswrapper[4651]: I1126 15:00:01.247790 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" event={"ID":"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25","Type":"ContainerDied","Data":"bbc70c041fa11857e481f00870f1d514093f93490f9b67cfb14120b0f1832159"} Nov 26 15:00:01 crc kubenswrapper[4651]: I1126 15:00:01.247991 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" event={"ID":"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25","Type":"ContainerStarted","Data":"5018967ef0bee8bfc4131fce55017a392cfcd7693afbf1551ba0636a4c3deab9"} Nov 26 15:00:02 crc kubenswrapper[4651]: I1126 15:00:02.499712 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:02 crc kubenswrapper[4651]: I1126 15:00:02.514135 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tksk\" (UniqueName: \"kubernetes.io/projected/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-kube-api-access-4tksk\") pod \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " Nov 26 15:00:02 crc kubenswrapper[4651]: I1126 15:00:02.514211 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-secret-volume\") pod \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " Nov 26 15:00:02 crc kubenswrapper[4651]: I1126 15:00:02.514253 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-config-volume\") pod \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\" (UID: \"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25\") " Nov 26 15:00:02 crc kubenswrapper[4651]: I1126 15:00:02.516623 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-config-volume" (OuterVolumeSpecName: "config-volume") pod "406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25" (UID: "406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:00:02 crc kubenswrapper[4651]: I1126 15:00:02.523618 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-kube-api-access-4tksk" (OuterVolumeSpecName: "kube-api-access-4tksk") pod "406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25" (UID: "406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25"). 
InnerVolumeSpecName "kube-api-access-4tksk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:00:02 crc kubenswrapper[4651]: I1126 15:00:02.523872 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25" (UID: "406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:00:02 crc kubenswrapper[4651]: I1126 15:00:02.615989 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tksk\" (UniqueName: \"kubernetes.io/projected/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-kube-api-access-4tksk\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:02 crc kubenswrapper[4651]: I1126 15:00:02.616096 4651 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:02 crc kubenswrapper[4651]: I1126 15:00:02.616125 4651 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:03 crc kubenswrapper[4651]: I1126 15:00:03.262386 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" event={"ID":"406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25","Type":"ContainerDied","Data":"5018967ef0bee8bfc4131fce55017a392cfcd7693afbf1551ba0636a4c3deab9"} Nov 26 15:00:03 crc kubenswrapper[4651]: I1126 15:00:03.262429 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5018967ef0bee8bfc4131fce55017a392cfcd7693afbf1551ba0636a4c3deab9" Nov 26 15:00:03 crc kubenswrapper[4651]: I1126 15:00:03.262742 4651 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-9fs8x" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.453512 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9"] Nov 26 15:00:04 crc kubenswrapper[4651]: E1126 15:00:04.454086 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25" containerName="collect-profiles" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.454109 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25" containerName="collect-profiles" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.454234 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="406f4ac0-f2c1-404d-8d5f-50ef2cc8fb25" containerName="collect-profiles" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.455099 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.457990 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.463796 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9"] Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.637885 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.637943 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k92hg\" (UniqueName: \"kubernetes.io/projected/83e0d76b-27c5-4f78-8f50-48beac51f214-kube-api-access-k92hg\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.638002 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:04 crc kubenswrapper[4651]: 
I1126 15:00:04.739783 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.739876 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k92hg\" (UniqueName: \"kubernetes.io/projected/83e0d76b-27c5-4f78-8f50-48beac51f214-kube-api-access-k92hg\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.739944 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.740766 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.740789 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.760276 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k92hg\" (UniqueName: \"kubernetes.io/projected/83e0d76b-27c5-4f78-8f50-48beac51f214-kube-api-access-k92hg\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.770534 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:04 crc kubenswrapper[4651]: I1126 15:00:04.952763 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9"] Nov 26 15:00:05 crc kubenswrapper[4651]: I1126 15:00:05.275448 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" event={"ID":"83e0d76b-27c5-4f78-8f50-48beac51f214","Type":"ContainerStarted","Data":"6f6609923716eebdea303435fca25939ec16f5ad71357cd9bca2452c05f298e4"} Nov 26 15:00:05 crc kubenswrapper[4651]: I1126 15:00:05.275883 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" event={"ID":"83e0d76b-27c5-4f78-8f50-48beac51f214","Type":"ContainerStarted","Data":"3468ea04150c64c1796038da084cc4de53273d80f396fd3b69c09e7cad32fa2a"} Nov 26 15:00:06 crc kubenswrapper[4651]: I1126 15:00:06.282105 4651 
generic.go:334] "Generic (PLEG): container finished" podID="83e0d76b-27c5-4f78-8f50-48beac51f214" containerID="6f6609923716eebdea303435fca25939ec16f5ad71357cd9bca2452c05f298e4" exitCode=0 Nov 26 15:00:06 crc kubenswrapper[4651]: I1126 15:00:06.282151 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" event={"ID":"83e0d76b-27c5-4f78-8f50-48beac51f214","Type":"ContainerDied","Data":"6f6609923716eebdea303435fca25939ec16f5ad71357cd9bca2452c05f298e4"} Nov 26 15:00:09 crc kubenswrapper[4651]: I1126 15:00:09.298458 4651 generic.go:334] "Generic (PLEG): container finished" podID="83e0d76b-27c5-4f78-8f50-48beac51f214" containerID="24bcb28c3e4c7e062de906ceb317ba0f34799f2b5813819d7c3fb709db3ccdc7" exitCode=0 Nov 26 15:00:09 crc kubenswrapper[4651]: I1126 15:00:09.298583 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" event={"ID":"83e0d76b-27c5-4f78-8f50-48beac51f214","Type":"ContainerDied","Data":"24bcb28c3e4c7e062de906ceb317ba0f34799f2b5813819d7c3fb709db3ccdc7"} Nov 26 15:00:10 crc kubenswrapper[4651]: I1126 15:00:10.307402 4651 generic.go:334] "Generic (PLEG): container finished" podID="83e0d76b-27c5-4f78-8f50-48beac51f214" containerID="869b1c9d3aeeaf38234b61b69b308876778d8dd9eff93fa9d5a64007d802378c" exitCode=0 Nov 26 15:00:10 crc kubenswrapper[4651]: I1126 15:00:10.307545 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" event={"ID":"83e0d76b-27c5-4f78-8f50-48beac51f214","Type":"ContainerDied","Data":"869b1c9d3aeeaf38234b61b69b308876778d8dd9eff93fa9d5a64007d802378c"} Nov 26 15:00:11 crc kubenswrapper[4651]: I1126 15:00:11.617828 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:11 crc kubenswrapper[4651]: I1126 15:00:11.719788 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k92hg\" (UniqueName: \"kubernetes.io/projected/83e0d76b-27c5-4f78-8f50-48beac51f214-kube-api-access-k92hg\") pod \"83e0d76b-27c5-4f78-8f50-48beac51f214\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " Nov 26 15:00:11 crc kubenswrapper[4651]: I1126 15:00:11.719873 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-bundle\") pod \"83e0d76b-27c5-4f78-8f50-48beac51f214\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " Nov 26 15:00:11 crc kubenswrapper[4651]: I1126 15:00:11.720008 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-util\") pod \"83e0d76b-27c5-4f78-8f50-48beac51f214\" (UID: \"83e0d76b-27c5-4f78-8f50-48beac51f214\") " Nov 26 15:00:11 crc kubenswrapper[4651]: I1126 15:00:11.720627 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-bundle" (OuterVolumeSpecName: "bundle") pod "83e0d76b-27c5-4f78-8f50-48beac51f214" (UID: "83e0d76b-27c5-4f78-8f50-48beac51f214"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:00:11 crc kubenswrapper[4651]: I1126 15:00:11.726819 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e0d76b-27c5-4f78-8f50-48beac51f214-kube-api-access-k92hg" (OuterVolumeSpecName: "kube-api-access-k92hg") pod "83e0d76b-27c5-4f78-8f50-48beac51f214" (UID: "83e0d76b-27c5-4f78-8f50-48beac51f214"). InnerVolumeSpecName "kube-api-access-k92hg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:00:11 crc kubenswrapper[4651]: I1126 15:00:11.732160 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-util" (OuterVolumeSpecName: "util") pod "83e0d76b-27c5-4f78-8f50-48beac51f214" (UID: "83e0d76b-27c5-4f78-8f50-48beac51f214"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:00:11 crc kubenswrapper[4651]: I1126 15:00:11.822189 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k92hg\" (UniqueName: \"kubernetes.io/projected/83e0d76b-27c5-4f78-8f50-48beac51f214-kube-api-access-k92hg\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:11 crc kubenswrapper[4651]: I1126 15:00:11.822230 4651 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:11 crc kubenswrapper[4651]: I1126 15:00:11.822243 4651 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/83e0d76b-27c5-4f78-8f50-48beac51f214-util\") on node \"crc\" DevicePath \"\"" Nov 26 15:00:12 crc kubenswrapper[4651]: I1126 15:00:12.319815 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" event={"ID":"83e0d76b-27c5-4f78-8f50-48beac51f214","Type":"ContainerDied","Data":"3468ea04150c64c1796038da084cc4de53273d80f396fd3b69c09e7cad32fa2a"} Nov 26 15:00:12 crc kubenswrapper[4651]: I1126 15:00:12.319862 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3468ea04150c64c1796038da084cc4de53273d80f396fd3b69c09e7cad32fa2a" Nov 26 15:00:12 crc kubenswrapper[4651]: I1126 15:00:12.320223 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.009902 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-g2r5z"] Nov 26 15:00:16 crc kubenswrapper[4651]: E1126 15:00:16.010767 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e0d76b-27c5-4f78-8f50-48beac51f214" containerName="extract" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.010782 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e0d76b-27c5-4f78-8f50-48beac51f214" containerName="extract" Nov 26 15:00:16 crc kubenswrapper[4651]: E1126 15:00:16.010806 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e0d76b-27c5-4f78-8f50-48beac51f214" containerName="pull" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.010813 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e0d76b-27c5-4f78-8f50-48beac51f214" containerName="pull" Nov 26 15:00:16 crc kubenswrapper[4651]: E1126 15:00:16.010823 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e0d76b-27c5-4f78-8f50-48beac51f214" containerName="util" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.010830 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e0d76b-27c5-4f78-8f50-48beac51f214" containerName="util" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.010940 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e0d76b-27c5-4f78-8f50-48beac51f214" containerName="extract" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.011416 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-g2r5z" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.013674 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.013704 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.013776 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-pt6md" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.030332 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-g2r5z"] Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.172779 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf4h4\" (UniqueName: \"kubernetes.io/projected/03019cf8-465a-4be9-b1f1-3137424cde1c-kube-api-access-hf4h4\") pod \"nmstate-operator-557fdffb88-g2r5z\" (UID: \"03019cf8-465a-4be9-b1f1-3137424cde1c\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-g2r5z" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.273673 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf4h4\" (UniqueName: \"kubernetes.io/projected/03019cf8-465a-4be9-b1f1-3137424cde1c-kube-api-access-hf4h4\") pod \"nmstate-operator-557fdffb88-g2r5z\" (UID: \"03019cf8-465a-4be9-b1f1-3137424cde1c\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-g2r5z" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.297791 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf4h4\" (UniqueName: \"kubernetes.io/projected/03019cf8-465a-4be9-b1f1-3137424cde1c-kube-api-access-hf4h4\") pod \"nmstate-operator-557fdffb88-g2r5z\" (UID: 
\"03019cf8-465a-4be9-b1f1-3137424cde1c\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-g2r5z" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.327427 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-g2r5z" Nov 26 15:00:16 crc kubenswrapper[4651]: I1126 15:00:16.745216 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-g2r5z"] Nov 26 15:00:17 crc kubenswrapper[4651]: I1126 15:00:17.346160 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-g2r5z" event={"ID":"03019cf8-465a-4be9-b1f1-3137424cde1c","Type":"ContainerStarted","Data":"ca2d0d12c4c53882981088afbaa8d696eedd61e76b29d7506d7e4ec8423aadbc"} Nov 26 15:00:21 crc kubenswrapper[4651]: I1126 15:00:21.371240 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-g2r5z" event={"ID":"03019cf8-465a-4be9-b1f1-3137424cde1c","Type":"ContainerStarted","Data":"7fba0bb54386fdbc256230f333037fc5a67b41e8c4121b863df06b88507362ea"} Nov 26 15:00:21 crc kubenswrapper[4651]: I1126 15:00:21.393060 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-g2r5z" podStartSLOduration=2.675731545 podStartE2EDuration="6.393017543s" podCreationTimestamp="2025-11-26 15:00:15 +0000 UTC" firstStartedPulling="2025-11-26 15:00:16.755086903 +0000 UTC m=+584.180834497" lastFinishedPulling="2025-11-26 15:00:20.472372881 +0000 UTC m=+587.898120495" observedRunningTime="2025-11-26 15:00:21.39070752 +0000 UTC m=+588.816455124" watchObservedRunningTime="2025-11-26 15:00:21.393017543 +0000 UTC m=+588.818765157" Nov 26 15:00:24 crc kubenswrapper[4651]: I1126 15:00:24.945584 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj"] Nov 26 15:00:24 crc kubenswrapper[4651]: I1126 
15:00:24.959539 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj" Nov 26 15:00:24 crc kubenswrapper[4651]: I1126 15:00:24.962141 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2hmsq" Nov 26 15:00:24 crc kubenswrapper[4651]: I1126 15:00:24.971814 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj"] Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.009216 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-fz8kc"] Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.009916 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.016620 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk"] Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.017272 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.019268 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.083009 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk"] Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.087767 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lrdg\" (UniqueName: \"kubernetes.io/projected/fbfcbea7-d7be-4196-9146-159b5fdc8afa-kube-api-access-5lrdg\") pod \"nmstate-metrics-5dcf9c57c5-kl9fj\" (UID: \"fbfcbea7-d7be-4196-9146-159b5fdc8afa\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.158782 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj"] Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.159610 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" Nov 26 15:00:25 crc kubenswrapper[4651]: W1126 15:00:25.161465 4651 reflector.go:561] object-"openshift-nmstate"/"plugin-serving-cert": failed to list *v1.Secret: secrets "plugin-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Nov 26 15:00:25 crc kubenswrapper[4651]: E1126 15:00:25.161535 4651 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"plugin-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"plugin-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.161876 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-f4vf8" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.162105 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.175230 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj"] Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.188977 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8e7b8425-38cf-4ff3-967d-18555db204a9-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7lwbk\" (UID: \"8e7b8425-38cf-4ff3-967d-18555db204a9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.189015 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcsrk\" (UniqueName: \"kubernetes.io/projected/66a988fd-9365-4310-ae73-c26d1da23d30-kube-api-access-jcsrk\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.189058 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/66a988fd-9365-4310-ae73-c26d1da23d30-ovs-socket\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.189083 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lrdg\" (UniqueName: \"kubernetes.io/projected/fbfcbea7-d7be-4196-9146-159b5fdc8afa-kube-api-access-5lrdg\") pod \"nmstate-metrics-5dcf9c57c5-kl9fj\" (UID: \"fbfcbea7-d7be-4196-9146-159b5fdc8afa\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.189101 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/66a988fd-9365-4310-ae73-c26d1da23d30-nmstate-lock\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.189122 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/66a988fd-9365-4310-ae73-c26d1da23d30-dbus-socket\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.189145 4651 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgbcl\" (UniqueName: \"kubernetes.io/projected/8e7b8425-38cf-4ff3-967d-18555db204a9-kube-api-access-xgbcl\") pod \"nmstate-webhook-6b89b748d8-7lwbk\" (UID: \"8e7b8425-38cf-4ff3-967d-18555db204a9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.208836 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lrdg\" (UniqueName: \"kubernetes.io/projected/fbfcbea7-d7be-4196-9146-159b5fdc8afa-kube-api-access-5lrdg\") pod \"nmstate-metrics-5dcf9c57c5-kl9fj\" (UID: \"fbfcbea7-d7be-4196-9146-159b5fdc8afa\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.285950 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.290658 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8e7b8425-38cf-4ff3-967d-18555db204a9-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7lwbk\" (UID: \"8e7b8425-38cf-4ff3-967d-18555db204a9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.290707 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcsrk\" (UniqueName: \"kubernetes.io/projected/66a988fd-9365-4310-ae73-c26d1da23d30-kube-api-access-jcsrk\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.290736 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25w86\" (UniqueName: 
\"kubernetes.io/projected/ea25780c-5944-4cbf-a8f0-e1e3dd4617f7-kube-api-access-25w86\") pod \"nmstate-console-plugin-5874bd7bc5-lnrjj\" (UID: \"ea25780c-5944-4cbf-a8f0-e1e3dd4617f7\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.290786 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/66a988fd-9365-4310-ae73-c26d1da23d30-ovs-socket\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.290814 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ea25780c-5944-4cbf-a8f0-e1e3dd4617f7-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-lnrjj\" (UID: \"ea25780c-5944-4cbf-a8f0-e1e3dd4617f7\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.290835 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea25780c-5944-4cbf-a8f0-e1e3dd4617f7-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-lnrjj\" (UID: \"ea25780c-5944-4cbf-a8f0-e1e3dd4617f7\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.290857 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/66a988fd-9365-4310-ae73-c26d1da23d30-nmstate-lock\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: E1126 15:00:25.290880 4651 secret.go:188] Couldn't get secret 
openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Nov 26 15:00:25 crc kubenswrapper[4651]: E1126 15:00:25.290965 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e7b8425-38cf-4ff3-967d-18555db204a9-tls-key-pair podName:8e7b8425-38cf-4ff3-967d-18555db204a9 nodeName:}" failed. No retries permitted until 2025-11-26 15:00:25.790937183 +0000 UTC m=+593.216684817 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/8e7b8425-38cf-4ff3-967d-18555db204a9-tls-key-pair") pod "nmstate-webhook-6b89b748d8-7lwbk" (UID: "8e7b8425-38cf-4ff3-967d-18555db204a9") : secret "openshift-nmstate-webhook" not found Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.291142 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/66a988fd-9365-4310-ae73-c26d1da23d30-dbus-socket\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.290889 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/66a988fd-9365-4310-ae73-c26d1da23d30-dbus-socket\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.291334 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgbcl\" (UniqueName: \"kubernetes.io/projected/8e7b8425-38cf-4ff3-967d-18555db204a9-kube-api-access-xgbcl\") pod \"nmstate-webhook-6b89b748d8-7lwbk\" (UID: \"8e7b8425-38cf-4ff3-967d-18555db204a9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.291370 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/66a988fd-9365-4310-ae73-c26d1da23d30-ovs-socket\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.291404 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/66a988fd-9365-4310-ae73-c26d1da23d30-nmstate-lock\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.320734 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcsrk\" (UniqueName: \"kubernetes.io/projected/66a988fd-9365-4310-ae73-c26d1da23d30-kube-api-access-jcsrk\") pod \"nmstate-handler-fz8kc\" (UID: \"66a988fd-9365-4310-ae73-c26d1da23d30\") " pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.327710 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.333348 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgbcl\" (UniqueName: \"kubernetes.io/projected/8e7b8425-38cf-4ff3-967d-18555db204a9-kube-api-access-xgbcl\") pod \"nmstate-webhook-6b89b748d8-7lwbk\" (UID: \"8e7b8425-38cf-4ff3-967d-18555db204a9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.392317 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25w86\" (UniqueName: \"kubernetes.io/projected/ea25780c-5944-4cbf-a8f0-e1e3dd4617f7-kube-api-access-25w86\") pod \"nmstate-console-plugin-5874bd7bc5-lnrjj\" (UID: \"ea25780c-5944-4cbf-a8f0-e1e3dd4617f7\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.392737 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea25780c-5944-4cbf-a8f0-e1e3dd4617f7-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-lnrjj\" (UID: \"ea25780c-5944-4cbf-a8f0-e1e3dd4617f7\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.392766 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ea25780c-5944-4cbf-a8f0-e1e3dd4617f7-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-lnrjj\" (UID: \"ea25780c-5944-4cbf-a8f0-e1e3dd4617f7\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.393861 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ea25780c-5944-4cbf-a8f0-e1e3dd4617f7-nginx-conf\") pod 
\"nmstate-console-plugin-5874bd7bc5-lnrjj\" (UID: \"ea25780c-5944-4cbf-a8f0-e1e3dd4617f7\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.416517 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fz8kc" event={"ID":"66a988fd-9365-4310-ae73-c26d1da23d30","Type":"ContainerStarted","Data":"243ded51ca857062d9d14c5478356b9bf873c4a86b2d11a053bb0d3ced028264"} Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.416560 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-65665976d6-2nsxb"] Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.417367 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.424330 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65665976d6-2nsxb"] Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.428258 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25w86\" (UniqueName: \"kubernetes.io/projected/ea25780c-5944-4cbf-a8f0-e1e3dd4617f7-kube-api-access-25w86\") pod \"nmstate-console-plugin-5874bd7bc5-lnrjj\" (UID: \"ea25780c-5944-4cbf-a8f0-e1e3dd4617f7\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.602276 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-service-ca\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.602338 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-45btc\" (UniqueName: \"kubernetes.io/projected/41618db6-1a19-45f0-832f-900d3f6e744a-kube-api-access-45btc\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.602429 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-console-config\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.602463 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-trusted-ca-bundle\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.602561 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-oauth-serving-cert\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.602591 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/41618db6-1a19-45f0-832f-900d3f6e744a-console-serving-cert\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.602611 4651 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/41618db6-1a19-45f0-832f-900d3f6e744a-console-oauth-config\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.703677 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-service-ca\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.703750 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45btc\" (UniqueName: \"kubernetes.io/projected/41618db6-1a19-45f0-832f-900d3f6e744a-kube-api-access-45btc\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.703795 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-console-config\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.703819 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-trusted-ca-bundle\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.703886 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-oauth-serving-cert\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.703913 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/41618db6-1a19-45f0-832f-900d3f6e744a-console-serving-cert\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.703935 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/41618db6-1a19-45f0-832f-900d3f6e744a-console-oauth-config\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.705107 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-console-config\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.705108 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-oauth-serving-cert\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.705200 4651 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-service-ca\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.705480 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41618db6-1a19-45f0-832f-900d3f6e744a-trusted-ca-bundle\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.707753 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/41618db6-1a19-45f0-832f-900d3f6e744a-console-oauth-config\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.709820 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/41618db6-1a19-45f0-832f-900d3f6e744a-console-serving-cert\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.723331 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45btc\" (UniqueName: \"kubernetes.io/projected/41618db6-1a19-45f0-832f-900d3f6e744a-kube-api-access-45btc\") pod \"console-65665976d6-2nsxb\" (UID: \"41618db6-1a19-45f0-832f-900d3f6e744a\") " pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.755380 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65665976d6-2nsxb" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.805604 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8e7b8425-38cf-4ff3-967d-18555db204a9-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7lwbk\" (UID: \"8e7b8425-38cf-4ff3-967d-18555db204a9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.808746 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8e7b8425-38cf-4ff3-967d-18555db204a9-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7lwbk\" (UID: \"8e7b8425-38cf-4ff3-967d-18555db204a9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.820219 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj"] Nov 26 15:00:25 crc kubenswrapper[4651]: W1126 15:00:25.828735 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbfcbea7_d7be_4196_9146_159b5fdc8afa.slice/crio-ac44c48946ba1aa432613cf884cac98c8d9ea374e79f93dd868bbb8bd55be36c WatchSource:0}: Error finding container ac44c48946ba1aa432613cf884cac98c8d9ea374e79f93dd868bbb8bd55be36c: Status 404 returned error can't find the container with id ac44c48946ba1aa432613cf884cac98c8d9ea374e79f93dd868bbb8bd55be36c Nov 26 15:00:25 crc kubenswrapper[4651]: I1126 15:00:25.937374 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" Nov 26 15:00:26 crc kubenswrapper[4651]: I1126 15:00:26.019047 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65665976d6-2nsxb"] Nov 26 15:00:26 crc kubenswrapper[4651]: W1126 15:00:26.027453 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41618db6_1a19_45f0_832f_900d3f6e744a.slice/crio-9e7e0f4dbd8640617e584d067ea304c0599fea36e846f303940d3ec394dcae20 WatchSource:0}: Error finding container 9e7e0f4dbd8640617e584d067ea304c0599fea36e846f303940d3ec394dcae20: Status 404 returned error can't find the container with id 9e7e0f4dbd8640617e584d067ea304c0599fea36e846f303940d3ec394dcae20 Nov 26 15:00:26 crc kubenswrapper[4651]: I1126 15:00:26.165716 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk"] Nov 26 15:00:26 crc kubenswrapper[4651]: W1126 15:00:26.175621 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7b8425_38cf_4ff3_967d_18555db204a9.slice/crio-05227dcb96fb47c135d23f7ca9f0d032e12b5787ddd9f925edf9faac784fe847 WatchSource:0}: Error finding container 05227dcb96fb47c135d23f7ca9f0d032e12b5787ddd9f925edf9faac784fe847: Status 404 returned error can't find the container with id 05227dcb96fb47c135d23f7ca9f0d032e12b5787ddd9f925edf9faac784fe847 Nov 26 15:00:26 crc kubenswrapper[4651]: I1126 15:00:26.273448 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 26 15:00:26 crc kubenswrapper[4651]: I1126 15:00:26.279453 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea25780c-5944-4cbf-a8f0-e1e3dd4617f7-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-lnrjj\" (UID: 
\"ea25780c-5944-4cbf-a8f0-e1e3dd4617f7\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" Nov 26 15:00:26 crc kubenswrapper[4651]: I1126 15:00:26.379299 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" Nov 26 15:00:26 crc kubenswrapper[4651]: I1126 15:00:26.419007 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65665976d6-2nsxb" event={"ID":"41618db6-1a19-45f0-832f-900d3f6e744a","Type":"ContainerStarted","Data":"17071123b7413094159ff551794cd14852dc7c2562166db7f12ed2480701c3c2"} Nov 26 15:00:26 crc kubenswrapper[4651]: I1126 15:00:26.419065 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65665976d6-2nsxb" event={"ID":"41618db6-1a19-45f0-832f-900d3f6e744a","Type":"ContainerStarted","Data":"9e7e0f4dbd8640617e584d067ea304c0599fea36e846f303940d3ec394dcae20"} Nov 26 15:00:26 crc kubenswrapper[4651]: I1126 15:00:26.421540 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj" event={"ID":"fbfcbea7-d7be-4196-9146-159b5fdc8afa","Type":"ContainerStarted","Data":"ac44c48946ba1aa432613cf884cac98c8d9ea374e79f93dd868bbb8bd55be36c"} Nov 26 15:00:26 crc kubenswrapper[4651]: I1126 15:00:26.424257 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" event={"ID":"8e7b8425-38cf-4ff3-967d-18555db204a9","Type":"ContainerStarted","Data":"05227dcb96fb47c135d23f7ca9f0d032e12b5787ddd9f925edf9faac784fe847"} Nov 26 15:00:26 crc kubenswrapper[4651]: I1126 15:00:26.436425 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65665976d6-2nsxb" podStartSLOduration=1.436393744 podStartE2EDuration="1.436393744s" podCreationTimestamp="2025-11-26 15:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:00:26.436340443 +0000 UTC m=+593.862088067" watchObservedRunningTime="2025-11-26 15:00:26.436393744 +0000 UTC m=+593.862141348" Nov 26 15:00:26 crc kubenswrapper[4651]: I1126 15:00:26.829889 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj"] Nov 26 15:00:26 crc kubenswrapper[4651]: W1126 15:00:26.885419 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea25780c_5944_4cbf_a8f0_e1e3dd4617f7.slice/crio-164c8fef27eba085a1a5f4f8cef2f7c14eca250cbc456713d0491c9e5c4ace24 WatchSource:0}: Error finding container 164c8fef27eba085a1a5f4f8cef2f7c14eca250cbc456713d0491c9e5c4ace24: Status 404 returned error can't find the container with id 164c8fef27eba085a1a5f4f8cef2f7c14eca250cbc456713d0491c9e5c4ace24 Nov 26 15:00:27 crc kubenswrapper[4651]: I1126 15:00:27.440415 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" event={"ID":"ea25780c-5944-4cbf-a8f0-e1e3dd4617f7","Type":"ContainerStarted","Data":"164c8fef27eba085a1a5f4f8cef2f7c14eca250cbc456713d0491c9e5c4ace24"} Nov 26 15:00:29 crc kubenswrapper[4651]: I1126 15:00:29.454162 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fz8kc" event={"ID":"66a988fd-9365-4310-ae73-c26d1da23d30","Type":"ContainerStarted","Data":"75e2b00fad13ed8bb8bf3888de972f4baac5c132960f7bf6b017b3e641c88f8c"} Nov 26 15:00:29 crc kubenswrapper[4651]: I1126 15:00:29.454849 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-fz8kc" Nov 26 15:00:29 crc kubenswrapper[4651]: I1126 15:00:29.459073 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj" 
event={"ID":"fbfcbea7-d7be-4196-9146-159b5fdc8afa","Type":"ContainerStarted","Data":"3f79f654fb8971ea01e89fa14adfe63477d0d40886578657bbdba9f0f4d3e6a4"} Nov 26 15:00:29 crc kubenswrapper[4651]: I1126 15:00:29.460406 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" event={"ID":"8e7b8425-38cf-4ff3-967d-18555db204a9","Type":"ContainerStarted","Data":"7961eeb37f7f6cbdad3e1cd7379191cc62372de8c5ee05fe2f3a4af370f8033e"} Nov 26 15:00:29 crc kubenswrapper[4651]: I1126 15:00:29.460562 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" Nov 26 15:00:29 crc kubenswrapper[4651]: I1126 15:00:29.470685 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-fz8kc" podStartSLOduration=1.960015427 podStartE2EDuration="5.470667818s" podCreationTimestamp="2025-11-26 15:00:24 +0000 UTC" firstStartedPulling="2025-11-26 15:00:25.369230059 +0000 UTC m=+592.794977663" lastFinishedPulling="2025-11-26 15:00:28.87988245 +0000 UTC m=+596.305630054" observedRunningTime="2025-11-26 15:00:29.469504987 +0000 UTC m=+596.895252611" watchObservedRunningTime="2025-11-26 15:00:29.470667818 +0000 UTC m=+596.896415422" Nov 26 15:00:29 crc kubenswrapper[4651]: I1126 15:00:29.486085 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk" podStartSLOduration=2.782137355 podStartE2EDuration="5.486068516s" podCreationTimestamp="2025-11-26 15:00:24 +0000 UTC" firstStartedPulling="2025-11-26 15:00:26.178994226 +0000 UTC m=+593.604741830" lastFinishedPulling="2025-11-26 15:00:28.882925387 +0000 UTC m=+596.308672991" observedRunningTime="2025-11-26 15:00:29.485289522 +0000 UTC m=+596.911037146" watchObservedRunningTime="2025-11-26 15:00:29.486068516 +0000 UTC m=+596.911816130" Nov 26 15:00:30 crc kubenswrapper[4651]: I1126 15:00:30.466690 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" event={"ID":"ea25780c-5944-4cbf-a8f0-e1e3dd4617f7","Type":"ContainerStarted","Data":"492b90ee9e354c417ecb920155de78bb5df162f850ab7b24f462e92e59d40480"} Nov 26 15:00:32 crc kubenswrapper[4651]: I1126 15:00:32.482307 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj" event={"ID":"fbfcbea7-d7be-4196-9146-159b5fdc8afa","Type":"ContainerStarted","Data":"d18ea5ee2d369a087a6b4a8fd93f19eadcde2b69932dae397b7005c2f2805e90"} Nov 26 15:00:32 crc kubenswrapper[4651]: I1126 15:00:32.499943 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-kl9fj" podStartSLOduration=2.5044342569999998 podStartE2EDuration="8.499926289s" podCreationTimestamp="2025-11-26 15:00:24 +0000 UTC" firstStartedPulling="2025-11-26 15:00:25.831264837 +0000 UTC m=+593.257012441" lastFinishedPulling="2025-11-26 15:00:31.826756869 +0000 UTC m=+599.252504473" observedRunningTime="2025-11-26 15:00:32.496468664 +0000 UTC m=+599.922216278" watchObservedRunningTime="2025-11-26 15:00:32.499926289 +0000 UTC m=+599.925673883" Nov 26 15:00:32 crc kubenswrapper[4651]: I1126 15:00:32.500213 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-lnrjj" podStartSLOduration=4.555298664 podStartE2EDuration="7.500208514s" podCreationTimestamp="2025-11-26 15:00:25 +0000 UTC" firstStartedPulling="2025-11-26 15:00:26.887405527 +0000 UTC m=+594.313153131" lastFinishedPulling="2025-11-26 15:00:29.832315377 +0000 UTC m=+597.258062981" observedRunningTime="2025-11-26 15:00:30.481648131 +0000 UTC m=+597.907395745" watchObservedRunningTime="2025-11-26 15:00:32.500208514 +0000 UTC m=+599.925956118" Nov 26 15:00:35 crc kubenswrapper[4651]: I1126 15:00:35.350258 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-fz8kc"
Nov 26 15:00:35 crc kubenswrapper[4651]: I1126 15:00:35.756566 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65665976d6-2nsxb"
Nov 26 15:00:35 crc kubenswrapper[4651]: I1126 15:00:35.756963 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-65665976d6-2nsxb"
Nov 26 15:00:35 crc kubenswrapper[4651]: I1126 15:00:35.760712 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65665976d6-2nsxb"
Nov 26 15:00:36 crc kubenswrapper[4651]: I1126 15:00:36.504577 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65665976d6-2nsxb"
Nov 26 15:00:36 crc kubenswrapper[4651]: I1126 15:00:36.543373 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q4qzb"]
Nov 26 15:00:45 crc kubenswrapper[4651]: I1126 15:00:45.945341 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7lwbk"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.227664 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"]
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.229540 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.232449 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.242312 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"]
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.365617 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.365683 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-588z6\" (UniqueName: \"kubernetes.io/projected/ca827f37-4b80-4699-91a5-8074b68a628c-kube-api-access-588z6\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.365809 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.467025 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.467082 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-588z6\" (UniqueName: \"kubernetes.io/projected/ca827f37-4b80-4699-91a5-8074b68a628c-kube-api-access-588z6\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.467120 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.467548 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.467634 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.492631 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-588z6\" (UniqueName: \"kubernetes.io/projected/ca827f37-4b80-4699-91a5-8074b68a628c-kube-api-access-588z6\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.547238 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:00:59 crc kubenswrapper[4651]: I1126 15:00:59.997325 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"]
Nov 26 15:01:00 crc kubenswrapper[4651]: I1126 15:01:00.629932 4651 generic.go:334] "Generic (PLEG): container finished" podID="ca827f37-4b80-4699-91a5-8074b68a628c" containerID="6b7d7b95a949e556cc4726de640a896915af1a2d0a37826f5a76ede59ef51da5" exitCode=0
Nov 26 15:01:00 crc kubenswrapper[4651]: I1126 15:01:00.629991 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4" event={"ID":"ca827f37-4b80-4699-91a5-8074b68a628c","Type":"ContainerDied","Data":"6b7d7b95a949e556cc4726de640a896915af1a2d0a37826f5a76ede59ef51da5"}
Nov 26 15:01:00 crc kubenswrapper[4651]: I1126 15:01:00.630072 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4" event={"ID":"ca827f37-4b80-4699-91a5-8074b68a628c","Type":"ContainerStarted","Data":"6acc3d1557922c758be8d4a79298e0517d05da6a5f65319d2f836ee83778158c"}
Nov 26 15:01:01 crc kubenswrapper[4651]: I1126 15:01:01.594489 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-q4qzb" podUID="74cd140b-bb74-4152-bb6f-0a42f92c864e" containerName="console" containerID="cri-o://0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5" gracePeriod=15
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.221984 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q4qzb_74cd140b-bb74-4152-bb6f-0a42f92c864e/console/0.log"
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.222433 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q4qzb"
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.331602 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-trusted-ca-bundle\") pod \"74cd140b-bb74-4152-bb6f-0a42f92c864e\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") "
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.331666 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-serving-cert\") pod \"74cd140b-bb74-4152-bb6f-0a42f92c864e\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") "
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.331711 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-oauth-serving-cert\") pod \"74cd140b-bb74-4152-bb6f-0a42f92c864e\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") "
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.331747 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpr2f\" (UniqueName: \"kubernetes.io/projected/74cd140b-bb74-4152-bb6f-0a42f92c864e-kube-api-access-hpr2f\") pod \"74cd140b-bb74-4152-bb6f-0a42f92c864e\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") "
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.331783 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-oauth-config\") pod \"74cd140b-bb74-4152-bb6f-0a42f92c864e\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") "
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.331892 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-config\") pod \"74cd140b-bb74-4152-bb6f-0a42f92c864e\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") "
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.331940 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-service-ca\") pod \"74cd140b-bb74-4152-bb6f-0a42f92c864e\" (UID: \"74cd140b-bb74-4152-bb6f-0a42f92c864e\") "
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.332387 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "74cd140b-bb74-4152-bb6f-0a42f92c864e" (UID: "74cd140b-bb74-4152-bb6f-0a42f92c864e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.332661 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-service-ca" (OuterVolumeSpecName: "service-ca") pod "74cd140b-bb74-4152-bb6f-0a42f92c864e" (UID: "74cd140b-bb74-4152-bb6f-0a42f92c864e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.333397 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-config" (OuterVolumeSpecName: "console-config") pod "74cd140b-bb74-4152-bb6f-0a42f92c864e" (UID: "74cd140b-bb74-4152-bb6f-0a42f92c864e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.333682 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "74cd140b-bb74-4152-bb6f-0a42f92c864e" (UID: "74cd140b-bb74-4152-bb6f-0a42f92c864e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.338266 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74cd140b-bb74-4152-bb6f-0a42f92c864e-kube-api-access-hpr2f" (OuterVolumeSpecName: "kube-api-access-hpr2f") pod "74cd140b-bb74-4152-bb6f-0a42f92c864e" (UID: "74cd140b-bb74-4152-bb6f-0a42f92c864e"). InnerVolumeSpecName "kube-api-access-hpr2f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.338521 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "74cd140b-bb74-4152-bb6f-0a42f92c864e" (UID: "74cd140b-bb74-4152-bb6f-0a42f92c864e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.339141 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "74cd140b-bb74-4152-bb6f-0a42f92c864e" (UID: "74cd140b-bb74-4152-bb6f-0a42f92c864e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.433559 4651 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-config\") on node \"crc\" DevicePath \"\""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.433600 4651 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-service-ca\") on node \"crc\" DevicePath \"\""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.433612 4651 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.433624 4651 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.433639 4651 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74cd140b-bb74-4152-bb6f-0a42f92c864e-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.433650 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpr2f\" (UniqueName: \"kubernetes.io/projected/74cd140b-bb74-4152-bb6f-0a42f92c864e-kube-api-access-hpr2f\") on node \"crc\" DevicePath \"\""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.433663 4651 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74cd140b-bb74-4152-bb6f-0a42f92c864e-console-oauth-config\") on node \"crc\" DevicePath \"\""
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.645031 4651 generic.go:334] "Generic (PLEG): container finished" podID="ca827f37-4b80-4699-91a5-8074b68a628c" containerID="898c3ed9fcf1d1e88cae61661d3a00d1717bd030423580dcc54083cdb3acc9a0" exitCode=0
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.645141 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4" event={"ID":"ca827f37-4b80-4699-91a5-8074b68a628c","Type":"ContainerDied","Data":"898c3ed9fcf1d1e88cae61661d3a00d1717bd030423580dcc54083cdb3acc9a0"}
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.657976 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q4qzb_74cd140b-bb74-4152-bb6f-0a42f92c864e/console/0.log"
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.658379 4651 generic.go:334] "Generic (PLEG): container finished" podID="74cd140b-bb74-4152-bb6f-0a42f92c864e" containerID="0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5" exitCode=2
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.658420 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q4qzb" event={"ID":"74cd140b-bb74-4152-bb6f-0a42f92c864e","Type":"ContainerDied","Data":"0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5"}
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.658451 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q4qzb"
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.658473 4651 scope.go:117] "RemoveContainer" containerID="0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5"
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.658455 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q4qzb" event={"ID":"74cd140b-bb74-4152-bb6f-0a42f92c864e","Type":"ContainerDied","Data":"b16eec6816a403c6deb6edf40af131b29cdec029d85256ce0b42d77bb49e5867"}
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.690147 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q4qzb"]
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.694697 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-q4qzb"]
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.703349 4651 scope.go:117] "RemoveContainer" containerID="0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5"
Nov 26 15:01:02 crc kubenswrapper[4651]: E1126 15:01:02.705840 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5\": container with ID starting with 0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5 not found: ID does not exist" containerID="0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5"
Nov 26 15:01:02 crc kubenswrapper[4651]: I1126 15:01:02.705875 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5"} err="failed to get container status \"0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5\": rpc error: code = NotFound desc = could not find container \"0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5\": container with ID starting with 0b6d7e73c05d78c100e60acc9fea13d5859d31141fe4821803218f70c76a60e5 not found: ID does not exist"
Nov 26 15:01:03 crc kubenswrapper[4651]: I1126 15:01:03.409459 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74cd140b-bb74-4152-bb6f-0a42f92c864e" path="/var/lib/kubelet/pods/74cd140b-bb74-4152-bb6f-0a42f92c864e/volumes"
Nov 26 15:01:03 crc kubenswrapper[4651]: I1126 15:01:03.665381 4651 generic.go:334] "Generic (PLEG): container finished" podID="ca827f37-4b80-4699-91a5-8074b68a628c" containerID="dba2b45d316d1a6b08285452f1222896c7dce97a0daf396a0b0c7ec4cefbee95" exitCode=0
Nov 26 15:01:03 crc kubenswrapper[4651]: I1126 15:01:03.665446 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4" event={"ID":"ca827f37-4b80-4699-91a5-8074b68a628c","Type":"ContainerDied","Data":"dba2b45d316d1a6b08285452f1222896c7dce97a0daf396a0b0c7ec4cefbee95"}
Nov 26 15:01:04 crc kubenswrapper[4651]: I1126 15:01:04.888937 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.068288 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-bundle\") pod \"ca827f37-4b80-4699-91a5-8074b68a628c\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") "
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.068378 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-util\") pod \"ca827f37-4b80-4699-91a5-8074b68a628c\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") "
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.068465 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-588z6\" (UniqueName: \"kubernetes.io/projected/ca827f37-4b80-4699-91a5-8074b68a628c-kube-api-access-588z6\") pod \"ca827f37-4b80-4699-91a5-8074b68a628c\" (UID: \"ca827f37-4b80-4699-91a5-8074b68a628c\") "
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.069387 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-bundle" (OuterVolumeSpecName: "bundle") pod "ca827f37-4b80-4699-91a5-8074b68a628c" (UID: "ca827f37-4b80-4699-91a5-8074b68a628c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.086541 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-util" (OuterVolumeSpecName: "util") pod "ca827f37-4b80-4699-91a5-8074b68a628c" (UID: "ca827f37-4b80-4699-91a5-8074b68a628c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.087179 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca827f37-4b80-4699-91a5-8074b68a628c-kube-api-access-588z6" (OuterVolumeSpecName: "kube-api-access-588z6") pod "ca827f37-4b80-4699-91a5-8074b68a628c" (UID: "ca827f37-4b80-4699-91a5-8074b68a628c"). InnerVolumeSpecName "kube-api-access-588z6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.169780 4651 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.169817 4651 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca827f37-4b80-4699-91a5-8074b68a628c-util\") on node \"crc\" DevicePath \"\""
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.169829 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-588z6\" (UniqueName: \"kubernetes.io/projected/ca827f37-4b80-4699-91a5-8074b68a628c-kube-api-access-588z6\") on node \"crc\" DevicePath \"\""
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.680821 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4" event={"ID":"ca827f37-4b80-4699-91a5-8074b68a628c","Type":"ContainerDied","Data":"6acc3d1557922c758be8d4a79298e0517d05da6a5f65319d2f836ee83778158c"}
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.680854 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6acc3d1557922c758be8d4a79298e0517d05da6a5f65319d2f836ee83778158c"
Nov 26 15:01:05 crc kubenswrapper[4651]: I1126 15:01:05.680881 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.507958 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fzwc6"]
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.508774 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" podUID="916a34e5-fa74-4e59-9deb-18a4067f007b" containerName="controller-manager" containerID="cri-o://66683b0bb1d1c80326bec688842c05f41bf6b0b90809b93a8a36e3fe4b058e2d" gracePeriod=30
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.562929 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj"]
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.563603 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" podUID="15df1010-c6ea-4bca-9a97-e6659866310f" containerName="route-controller-manager" containerID="cri-o://9d72f5f8abcf53d078b82752e80535c3b233eef917918beb42481570bbed7650" gracePeriod=30
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.734315 4651 generic.go:334] "Generic (PLEG): container finished" podID="15df1010-c6ea-4bca-9a97-e6659866310f" containerID="9d72f5f8abcf53d078b82752e80535c3b233eef917918beb42481570bbed7650" exitCode=0
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.734365 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" event={"ID":"15df1010-c6ea-4bca-9a97-e6659866310f","Type":"ContainerDied","Data":"9d72f5f8abcf53d078b82752e80535c3b233eef917918beb42481570bbed7650"}
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.736685 4651 generic.go:334] "Generic (PLEG): container finished" podID="916a34e5-fa74-4e59-9deb-18a4067f007b" containerID="66683b0bb1d1c80326bec688842c05f41bf6b0b90809b93a8a36e3fe4b058e2d" exitCode=0
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.736725 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" event={"ID":"916a34e5-fa74-4e59-9deb-18a4067f007b","Type":"ContainerDied","Data":"66683b0bb1d1c80326bec688842c05f41bf6b0b90809b93a8a36e3fe4b058e2d"}
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.760352 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"]
Nov 26 15:01:15 crc kubenswrapper[4651]: E1126 15:01:15.760894 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca827f37-4b80-4699-91a5-8074b68a628c" containerName="util"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.760975 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca827f37-4b80-4699-91a5-8074b68a628c" containerName="util"
Nov 26 15:01:15 crc kubenswrapper[4651]: E1126 15:01:15.761029 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca827f37-4b80-4699-91a5-8074b68a628c" containerName="extract"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.761101 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca827f37-4b80-4699-91a5-8074b68a628c" containerName="extract"
Nov 26 15:01:15 crc kubenswrapper[4651]: E1126 15:01:15.761174 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74cd140b-bb74-4152-bb6f-0a42f92c864e" containerName="console"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.761222 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="74cd140b-bb74-4152-bb6f-0a42f92c864e" containerName="console"
Nov 26 15:01:15 crc kubenswrapper[4651]: E1126 15:01:15.761299 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca827f37-4b80-4699-91a5-8074b68a628c" containerName="pull"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.761370 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca827f37-4b80-4699-91a5-8074b68a628c" containerName="pull"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.761545 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="74cd140b-bb74-4152-bb6f-0a42f92c864e" containerName="console"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.761625 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca827f37-4b80-4699-91a5-8074b68a628c" containerName="extract"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.762138 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.764267 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.765135 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.765156 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.768131 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-bh8br"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.768231 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.785754 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"]
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.891296 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gvvw\" (UniqueName: \"kubernetes.io/projected/f688796e-89d5-4da8-8dc7-786c5940b853-kube-api-access-2gvvw\") pod \"metallb-operator-controller-manager-5b5d786cf6-wsrgh\" (UID: \"f688796e-89d5-4da8-8dc7-786c5940b853\") " pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.891381 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f688796e-89d5-4da8-8dc7-786c5940b853-apiservice-cert\") pod \"metallb-operator-controller-manager-5b5d786cf6-wsrgh\" (UID: \"f688796e-89d5-4da8-8dc7-786c5940b853\") " pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.891405 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f688796e-89d5-4da8-8dc7-786c5940b853-webhook-cert\") pod \"metallb-operator-controller-manager-5b5d786cf6-wsrgh\" (UID: \"f688796e-89d5-4da8-8dc7-786c5940b853\") " pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.992704 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gvvw\" (UniqueName: \"kubernetes.io/projected/f688796e-89d5-4da8-8dc7-786c5940b853-kube-api-access-2gvvw\") pod \"metallb-operator-controller-manager-5b5d786cf6-wsrgh\" (UID: \"f688796e-89d5-4da8-8dc7-786c5940b853\") " pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.992774 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f688796e-89d5-4da8-8dc7-786c5940b853-apiservice-cert\") pod \"metallb-operator-controller-manager-5b5d786cf6-wsrgh\" (UID: \"f688796e-89d5-4da8-8dc7-786c5940b853\") " pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"
Nov 26 15:01:15 crc kubenswrapper[4651]: I1126 15:01:15.992796 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f688796e-89d5-4da8-8dc7-786c5940b853-webhook-cert\") pod \"metallb-operator-controller-manager-5b5d786cf6-wsrgh\" (UID: \"f688796e-89d5-4da8-8dc7-786c5940b853\") " pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.005246 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f688796e-89d5-4da8-8dc7-786c5940b853-webhook-cert\") pod \"metallb-operator-controller-manager-5b5d786cf6-wsrgh\" (UID: \"f688796e-89d5-4da8-8dc7-786c5940b853\") " pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.031222 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f688796e-89d5-4da8-8dc7-786c5940b853-apiservice-cert\") pod \"metallb-operator-controller-manager-5b5d786cf6-wsrgh\" (UID: \"f688796e-89d5-4da8-8dc7-786c5940b853\") " pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.086105 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gvvw\" (UniqueName: \"kubernetes.io/projected/f688796e-89d5-4da8-8dc7-786c5940b853-kube-api-access-2gvvw\") pod \"metallb-operator-controller-manager-5b5d786cf6-wsrgh\" (UID: \"f688796e-89d5-4da8-8dc7-786c5940b853\") " pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.104141 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.245335 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.250383 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw"]
Nov 26 15:01:16 crc kubenswrapper[4651]: E1126 15:01:16.250593 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916a34e5-fa74-4e59-9deb-18a4067f007b" containerName="controller-manager"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.250615 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="916a34e5-fa74-4e59-9deb-18a4067f007b" containerName="controller-manager"
Nov 26 15:01:16 crc kubenswrapper[4651]: E1126 15:01:16.250638 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15df1010-c6ea-4bca-9a97-e6659866310f" containerName="route-controller-manager"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.250649 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="15df1010-c6ea-4bca-9a97-e6659866310f" containerName="route-controller-manager"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.250766 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="15df1010-c6ea-4bca-9a97-e6659866310f" containerName="route-controller-manager"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.250788 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="916a34e5-fa74-4e59-9deb-18a4067f007b" containerName="controller-manager"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.251183 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.255453 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.255611 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.257777 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hn64j"
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.290181 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw"]
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.295574 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64lj6\" (UniqueName: \"kubernetes.io/projected/15df1010-c6ea-4bca-9a97-e6659866310f-kube-api-access-64lj6\") pod \"15df1010-c6ea-4bca-9a97-e6659866310f\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") "
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.295629 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-client-ca\") pod \"15df1010-c6ea-4bca-9a97-e6659866310f\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") "
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.295678 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15df1010-c6ea-4bca-9a97-e6659866310f-serving-cert\") pod \"15df1010-c6ea-4bca-9a97-e6659866310f\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") "
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.295719 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-config\") pod \"15df1010-c6ea-4bca-9a97-e6659866310f\" (UID: \"15df1010-c6ea-4bca-9a97-e6659866310f\") "
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.299721 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-config" (OuterVolumeSpecName: "config") pod "15df1010-c6ea-4bca-9a97-e6659866310f" (UID: "15df1010-c6ea-4bca-9a97-e6659866310f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.304507 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15df1010-c6ea-4bca-9a97-e6659866310f-kube-api-access-64lj6" (OuterVolumeSpecName: "kube-api-access-64lj6") pod "15df1010-c6ea-4bca-9a97-e6659866310f" (UID: "15df1010-c6ea-4bca-9a97-e6659866310f"). InnerVolumeSpecName "kube-api-access-64lj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.304862 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-client-ca" (OuterVolumeSpecName: "client-ca") pod "15df1010-c6ea-4bca-9a97-e6659866310f" (UID: "15df1010-c6ea-4bca-9a97-e6659866310f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.314462 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15df1010-c6ea-4bca-9a97-e6659866310f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "15df1010-c6ea-4bca-9a97-e6659866310f" (UID: "15df1010-c6ea-4bca-9a97-e6659866310f"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.378696 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.397495 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-config\") pod \"916a34e5-fa74-4e59-9deb-18a4067f007b\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.397549 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-proxy-ca-bundles\") pod \"916a34e5-fa74-4e59-9deb-18a4067f007b\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.397600 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916a34e5-fa74-4e59-9deb-18a4067f007b-serving-cert\") pod \"916a34e5-fa74-4e59-9deb-18a4067f007b\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.397622 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85btv\" (UniqueName: \"kubernetes.io/projected/916a34e5-fa74-4e59-9deb-18a4067f007b-kube-api-access-85btv\") pod \"916a34e5-fa74-4e59-9deb-18a4067f007b\" (UID: \"916a34e5-fa74-4e59-9deb-18a4067f007b\") " Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.397685 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-client-ca\") pod \"916a34e5-fa74-4e59-9deb-18a4067f007b\" (UID: 
\"916a34e5-fa74-4e59-9deb-18a4067f007b\") " Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.397852 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tkwk\" (UniqueName: \"kubernetes.io/projected/f1830fea-fcaa-4159-a4c9-20787b409237-kube-api-access-8tkwk\") pod \"metallb-operator-webhook-server-7d67bf6468-cdmmw\" (UID: \"f1830fea-fcaa-4159-a4c9-20787b409237\") " pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.397890 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1830fea-fcaa-4159-a4c9-20787b409237-webhook-cert\") pod \"metallb-operator-webhook-server-7d67bf6468-cdmmw\" (UID: \"f1830fea-fcaa-4159-a4c9-20787b409237\") " pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.397927 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1830fea-fcaa-4159-a4c9-20787b409237-apiservice-cert\") pod \"metallb-operator-webhook-server-7d67bf6468-cdmmw\" (UID: \"f1830fea-fcaa-4159-a4c9-20787b409237\") " pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.397976 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64lj6\" (UniqueName: \"kubernetes.io/projected/15df1010-c6ea-4bca-9a97-e6659866310f-kube-api-access-64lj6\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.397987 4651 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 
15:01:16.397995 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15df1010-c6ea-4bca-9a97-e6659866310f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.398003 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15df1010-c6ea-4bca-9a97-e6659866310f-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.398350 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "916a34e5-fa74-4e59-9deb-18a4067f007b" (UID: "916a34e5-fa74-4e59-9deb-18a4067f007b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.398629 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-client-ca" (OuterVolumeSpecName: "client-ca") pod "916a34e5-fa74-4e59-9deb-18a4067f007b" (UID: "916a34e5-fa74-4e59-9deb-18a4067f007b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.399580 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-config" (OuterVolumeSpecName: "config") pod "916a34e5-fa74-4e59-9deb-18a4067f007b" (UID: "916a34e5-fa74-4e59-9deb-18a4067f007b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.401851 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916a34e5-fa74-4e59-9deb-18a4067f007b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "916a34e5-fa74-4e59-9deb-18a4067f007b" (UID: "916a34e5-fa74-4e59-9deb-18a4067f007b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.405648 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/916a34e5-fa74-4e59-9deb-18a4067f007b-kube-api-access-85btv" (OuterVolumeSpecName: "kube-api-access-85btv") pod "916a34e5-fa74-4e59-9deb-18a4067f007b" (UID: "916a34e5-fa74-4e59-9deb-18a4067f007b"). InnerVolumeSpecName "kube-api-access-85btv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.498826 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tkwk\" (UniqueName: \"kubernetes.io/projected/f1830fea-fcaa-4159-a4c9-20787b409237-kube-api-access-8tkwk\") pod \"metallb-operator-webhook-server-7d67bf6468-cdmmw\" (UID: \"f1830fea-fcaa-4159-a4c9-20787b409237\") " pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.499148 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1830fea-fcaa-4159-a4c9-20787b409237-webhook-cert\") pod \"metallb-operator-webhook-server-7d67bf6468-cdmmw\" (UID: \"f1830fea-fcaa-4159-a4c9-20787b409237\") " pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.499189 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/f1830fea-fcaa-4159-a4c9-20787b409237-apiservice-cert\") pod \"metallb-operator-webhook-server-7d67bf6468-cdmmw\" (UID: \"f1830fea-fcaa-4159-a4c9-20787b409237\") " pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.499276 4651 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.499289 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.499298 4651 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/916a34e5-fa74-4e59-9deb-18a4067f007b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.499307 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916a34e5-fa74-4e59-9deb-18a4067f007b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.499317 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85btv\" (UniqueName: \"kubernetes.io/projected/916a34e5-fa74-4e59-9deb-18a4067f007b-kube-api-access-85btv\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.504643 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1830fea-fcaa-4159-a4c9-20787b409237-apiservice-cert\") pod \"metallb-operator-webhook-server-7d67bf6468-cdmmw\" (UID: \"f1830fea-fcaa-4159-a4c9-20787b409237\") " 
pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.505117 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1830fea-fcaa-4159-a4c9-20787b409237-webhook-cert\") pod \"metallb-operator-webhook-server-7d67bf6468-cdmmw\" (UID: \"f1830fea-fcaa-4159-a4c9-20787b409237\") " pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.523855 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tkwk\" (UniqueName: \"kubernetes.io/projected/f1830fea-fcaa-4159-a4c9-20787b409237-kube-api-access-8tkwk\") pod \"metallb-operator-webhook-server-7d67bf6468-cdmmw\" (UID: \"f1830fea-fcaa-4159-a4c9-20787b409237\") " pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.566447 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.569916 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b6d4956-bfp27"] Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.570610 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.623532 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b6d4956-bfp27"] Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.712593 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-config\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.712664 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-client-ca\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.712707 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-serving-cert\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.712735 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-proxy-ca-bundles\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " 
pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.712757 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plgfg\" (UniqueName: \"kubernetes.io/projected/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-kube-api-access-plgfg\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.757156 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" event={"ID":"15df1010-c6ea-4bca-9a97-e6659866310f","Type":"ContainerDied","Data":"e4f9cfcf620b05eebd7cbf3dd217fb0b84407b7060a74d448388dd7c1c0baa2e"} Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.757224 4651 scope.go:117] "RemoveContainer" containerID="9d72f5f8abcf53d078b82752e80535c3b233eef917918beb42481570bbed7650" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.757342 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.771969 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" event={"ID":"916a34e5-fa74-4e59-9deb-18a4067f007b","Type":"ContainerDied","Data":"f572855ef1a897e7dfef9eb7a6a60dfbcb16ce607f7a08e0473e96ce186e3ac1"} Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.772060 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fzwc6" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.813765 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-serving-cert\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.813823 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-proxy-ca-bundles\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.813843 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plgfg\" (UniqueName: \"kubernetes.io/projected/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-kube-api-access-plgfg\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.813867 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-config\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.815313 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-client-ca\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.816242 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-client-ca\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.817212 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-config\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.817360 4651 scope.go:117] "RemoveContainer" containerID="66683b0bb1d1c80326bec688842c05f41bf6b0b90809b93a8a36e3fe4b058e2d" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.822542 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-proxy-ca-bundles\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.838901 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fzwc6"] Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.844200 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-serving-cert\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.849596 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fzwc6"] Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.862756 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plgfg\" (UniqueName: \"kubernetes.io/projected/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-kube-api-access-plgfg\") pod \"controller-manager-5b6d4956-bfp27\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.887584 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj"] Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.891891 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b8wlj"] Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.900920 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:16 crc kubenswrapper[4651]: I1126 15:01:16.992093 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"] Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.150181 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw"] Nov 26 15:01:17 crc kubenswrapper[4651]: W1126 15:01:17.161463 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1830fea_fcaa_4159_a4c9_20787b409237.slice/crio-b46c7f8502511d25f050f81c3fe3856028f3cc5ca09b270ade51fd2a976e0eb4 WatchSource:0}: Error finding container b46c7f8502511d25f050f81c3fe3856028f3cc5ca09b270ade51fd2a976e0eb4: Status 404 returned error can't find the container with id b46c7f8502511d25f050f81c3fe3856028f3cc5ca09b270ade51fd2a976e0eb4 Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.409670 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15df1010-c6ea-4bca-9a97-e6659866310f" path="/var/lib/kubelet/pods/15df1010-c6ea-4bca-9a97-e6659866310f/volumes" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.410817 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="916a34e5-fa74-4e59-9deb-18a4067f007b" path="/var/lib/kubelet/pods/916a34e5-fa74-4e59-9deb-18a4067f007b/volumes" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.426220 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b6d4956-bfp27"] Nov 26 15:01:17 crc kubenswrapper[4651]: W1126 15:01:17.430857 4651 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd64d8b5_cf4f_4cc2_b36a_1b235bb307d7.slice/crio-3248c240d884aedb7d69c3b5d09904bd38ee7ca0a5ed480c95600d1f7f0d56b7 WatchSource:0}: Error finding container 3248c240d884aedb7d69c3b5d09904bd38ee7ca0a5ed480c95600d1f7f0d56b7: Status 404 returned error can't find the container with id 3248c240d884aedb7d69c3b5d09904bd38ee7ca0a5ed480c95600d1f7f0d56b7 Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.513278 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b6d4956-bfp27"] Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.597088 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc"] Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.599134 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.604140 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.604416 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.604457 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.609350 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.609424 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 26 15:01:17 
crc kubenswrapper[4651]: I1126 15:01:17.609501 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.617664 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc"] Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.727290 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4ed5125-11a3-47f7-8646-992b8fb913c0-client-ca\") pod \"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.727446 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ed5125-11a3-47f7-8646-992b8fb913c0-config\") pod \"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.727529 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ed5125-11a3-47f7-8646-992b8fb913c0-serving-cert\") pod \"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.727550 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2fcd\" (UniqueName: \"kubernetes.io/projected/d4ed5125-11a3-47f7-8646-992b8fb913c0-kube-api-access-k2fcd\") pod 
\"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.777782 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" podUID="bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" containerName="controller-manager" containerID="cri-o://ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066" gracePeriod=30 Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.777997 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" event={"ID":"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7","Type":"ContainerStarted","Data":"ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066"} Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.778021 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" event={"ID":"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7","Type":"ContainerStarted","Data":"3248c240d884aedb7d69c3b5d09904bd38ee7ca0a5ed480c95600d1f7f0d56b7"} Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.778379 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.779416 4651 patch_prober.go:28] interesting pod/controller-manager-5b6d4956-bfp27 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.779446 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" 
podUID="bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.784660 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" event={"ID":"f688796e-89d5-4da8-8dc7-786c5940b853","Type":"ContainerStarted","Data":"98d521d28d0017c2e88182af1a521faee9f70b2961f1c2cef06897193c40eae7"} Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.786104 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" event={"ID":"f1830fea-fcaa-4159-a4c9-20787b409237","Type":"ContainerStarted","Data":"b46c7f8502511d25f050f81c3fe3856028f3cc5ca09b270ade51fd2a976e0eb4"} Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.812271 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" podStartSLOduration=2.81225801 podStartE2EDuration="2.81225801s" podCreationTimestamp="2025-11-26 15:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:01:17.810497382 +0000 UTC m=+645.236244986" watchObservedRunningTime="2025-11-26 15:01:17.81225801 +0000 UTC m=+645.238005604" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.829136 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ed5125-11a3-47f7-8646-992b8fb913c0-config\") pod \"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.829220 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ed5125-11a3-47f7-8646-992b8fb913c0-serving-cert\") pod \"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.829241 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2fcd\" (UniqueName: \"kubernetes.io/projected/d4ed5125-11a3-47f7-8646-992b8fb913c0-kube-api-access-k2fcd\") pod \"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.829266 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4ed5125-11a3-47f7-8646-992b8fb913c0-client-ca\") pod \"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.830165 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4ed5125-11a3-47f7-8646-992b8fb913c0-client-ca\") pod \"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.830878 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ed5125-11a3-47f7-8646-992b8fb913c0-config\") pod \"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " 
pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.835875 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ed5125-11a3-47f7-8646-992b8fb913c0-serving-cert\") pod \"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.856182 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2fcd\" (UniqueName: \"kubernetes.io/projected/d4ed5125-11a3-47f7-8646-992b8fb913c0-kube-api-access-k2fcd\") pod \"route-controller-manager-cdc848476-brbpc\" (UID: \"d4ed5125-11a3-47f7-8646-992b8fb913c0\") " pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:17 crc kubenswrapper[4651]: I1126 15:01:17.915621 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.189864 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-5b6d4956-bfp27_bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7/controller-manager/0.log" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.190191 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.245930 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc"] Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.338412 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plgfg\" (UniqueName: \"kubernetes.io/projected/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-kube-api-access-plgfg\") pod \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.338505 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-config\") pod \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.338568 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-serving-cert\") pod \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.338614 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-client-ca\") pod \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\" (UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.338644 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-proxy-ca-bundles\") pod \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\" 
(UID: \"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7\") " Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.339399 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-client-ca" (OuterVolumeSpecName: "client-ca") pod "bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" (UID: "bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.339435 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" (UID: "bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.339862 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-config" (OuterVolumeSpecName: "config") pod "bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" (UID: "bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.344229 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-kube-api-access-plgfg" (OuterVolumeSpecName: "kube-api-access-plgfg") pod "bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" (UID: "bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7"). InnerVolumeSpecName "kube-api-access-plgfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.347234 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" (UID: "bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.440070 4651 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.440109 4651 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.440146 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plgfg\" (UniqueName: \"kubernetes.io/projected/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-kube-api-access-plgfg\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.440157 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.440169 4651 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.571005 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64d476d475-cgxb9"] 
Nov 26 15:01:18 crc kubenswrapper[4651]: E1126 15:01:18.571276 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" containerName="controller-manager" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.571298 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" containerName="controller-manager" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.571401 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" containerName="controller-manager" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.571762 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.600049 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64d476d475-cgxb9"] Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.745547 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0208436-c4c6-472b-8853-d233cd9a6c42-serving-cert\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.745583 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0208436-c4c6-472b-8853-d233cd9a6c42-client-ca\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.745731 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0208436-c4c6-472b-8853-d233cd9a6c42-config\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.745829 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0208436-c4c6-472b-8853-d233cd9a6c42-proxy-ca-bundles\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.745965 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtvss\" (UniqueName: \"kubernetes.io/projected/a0208436-c4c6-472b-8853-d233cd9a6c42-kube-api-access-xtvss\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.794584 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-5b6d4956-bfp27_bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7/controller-manager/0.log" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.794625 4651 generic.go:334] "Generic (PLEG): container finished" podID="bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" containerID="ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066" exitCode=2 Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.794695 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" 
event={"ID":"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7","Type":"ContainerDied","Data":"ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066"} Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.794720 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" event={"ID":"bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7","Type":"ContainerDied","Data":"3248c240d884aedb7d69c3b5d09904bd38ee7ca0a5ed480c95600d1f7f0d56b7"} Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.794738 4651 scope.go:117] "RemoveContainer" containerID="ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.794848 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b6d4956-bfp27" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.799843 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" event={"ID":"d4ed5125-11a3-47f7-8646-992b8fb913c0","Type":"ContainerStarted","Data":"ceaf59bbbddf0498f6f64cf73ed4ef033434afe5cd9ae64bb3dd816bede81ec1"} Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.799889 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" event={"ID":"d4ed5125-11a3-47f7-8646-992b8fb913c0","Type":"ContainerStarted","Data":"5d87c6f5619cf6aedc5e73c50444dc766554cf3ef11a77ce31f0bd8e230d277e"} Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.800944 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.819682 4651 scope.go:117] "RemoveContainer" containerID="ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066" Nov 26 15:01:18 crc 
kubenswrapper[4651]: E1126 15:01:18.820097 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066\": container with ID starting with ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066 not found: ID does not exist" containerID="ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.820140 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066"} err="failed to get container status \"ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066\": rpc error: code = NotFound desc = could not find container \"ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066\": container with ID starting with ac7f88cd41d502f24daaf9cc95170ffe8865c9d42e7ce2000b23496312e08066 not found: ID does not exist" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.847262 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtvss\" (UniqueName: \"kubernetes.io/projected/a0208436-c4c6-472b-8853-d233cd9a6c42-kube-api-access-xtvss\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.847339 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0208436-c4c6-472b-8853-d233cd9a6c42-serving-cert\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.847365 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0208436-c4c6-472b-8853-d233cd9a6c42-client-ca\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.847429 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0208436-c4c6-472b-8853-d233cd9a6c42-config\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.847464 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0208436-c4c6-472b-8853-d233cd9a6c42-proxy-ca-bundles\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.848055 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" podStartSLOduration=1.8480153750000001 podStartE2EDuration="1.848015375s" podCreationTimestamp="2025-11-26 15:01:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:01:18.833250331 +0000 UTC m=+646.258998205" watchObservedRunningTime="2025-11-26 15:01:18.848015375 +0000 UTC m=+646.273762979" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.848651 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a0208436-c4c6-472b-8853-d233cd9a6c42-client-ca\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.849345 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0208436-c4c6-472b-8853-d233cd9a6c42-proxy-ca-bundles\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.849514 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0208436-c4c6-472b-8853-d233cd9a6c42-config\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.851716 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b6d4956-bfp27"] Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.869843 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0208436-c4c6-472b-8853-d233cd9a6c42-serving-cert\") pod \"controller-manager-64d476d475-cgxb9\" (UID: \"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.871557 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtvss\" (UniqueName: \"kubernetes.io/projected/a0208436-c4c6-472b-8853-d233cd9a6c42-kube-api-access-xtvss\") pod \"controller-manager-64d476d475-cgxb9\" (UID: 
\"a0208436-c4c6-472b-8853-d233cd9a6c42\") " pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.874379 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b6d4956-bfp27"] Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.885655 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:18 crc kubenswrapper[4651]: I1126 15:01:18.943874 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cdc848476-brbpc" Nov 26 15:01:19 crc kubenswrapper[4651]: I1126 15:01:19.222768 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64d476d475-cgxb9"] Nov 26 15:01:19 crc kubenswrapper[4651]: W1126 15:01:19.245404 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0208436_c4c6_472b_8853_d233cd9a6c42.slice/crio-289a46086fa816d3c277fcdd91db0e1c36c36bda46d58dea3a0cc13d850cc601 WatchSource:0}: Error finding container 289a46086fa816d3c277fcdd91db0e1c36c36bda46d58dea3a0cc13d850cc601: Status 404 returned error can't find the container with id 289a46086fa816d3c277fcdd91db0e1c36c36bda46d58dea3a0cc13d850cc601 Nov 26 15:01:19 crc kubenswrapper[4651]: I1126 15:01:19.414299 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7" path="/var/lib/kubelet/pods/bd64d8b5-cf4f-4cc2-b36a-1b235bb307d7/volumes" Nov 26 15:01:19 crc kubenswrapper[4651]: I1126 15:01:19.808651 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" 
event={"ID":"a0208436-c4c6-472b-8853-d233cd9a6c42","Type":"ContainerStarted","Data":"d87f2dd02a5750c82325dbf47127f7ec1de7a68b764f96c755196b45aaed4cac"} Nov 26 15:01:19 crc kubenswrapper[4651]: I1126 15:01:19.808699 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" event={"ID":"a0208436-c4c6-472b-8853-d233cd9a6c42","Type":"ContainerStarted","Data":"289a46086fa816d3c277fcdd91db0e1c36c36bda46d58dea3a0cc13d850cc601"} Nov 26 15:01:19 crc kubenswrapper[4651]: I1126 15:01:19.810002 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:19 crc kubenswrapper[4651]: I1126 15:01:19.816184 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" Nov 26 15:01:19 crc kubenswrapper[4651]: I1126 15:01:19.854105 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64d476d475-cgxb9" podStartSLOduration=2.854084797 podStartE2EDuration="2.854084797s" podCreationTimestamp="2025-11-26 15:01:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:01:19.839907479 +0000 UTC m=+647.265655103" watchObservedRunningTime="2025-11-26 15:01:19.854084797 +0000 UTC m=+647.279832401" Nov 26 15:01:24 crc kubenswrapper[4651]: I1126 15:01:24.859430 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" event={"ID":"f1830fea-fcaa-4159-a4c9-20787b409237","Type":"ContainerStarted","Data":"480276eb2decb6dfd867c25815afae6945aa5dbe8069e54e9067a6a47b82a836"} Nov 26 15:01:24 crc kubenswrapper[4651]: I1126 15:01:24.860932 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:24 crc kubenswrapper[4651]: I1126 15:01:24.862643 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" event={"ID":"f688796e-89d5-4da8-8dc7-786c5940b853","Type":"ContainerStarted","Data":"836f5fbd351ff1aac2a02ee27c4f73eeccc67aacdd1c8c67b75bb0115f646551"} Nov 26 15:01:24 crc kubenswrapper[4651]: I1126 15:01:24.863106 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" Nov 26 15:01:24 crc kubenswrapper[4651]: I1126 15:01:24.882457 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" podStartSLOduration=2.00337373 podStartE2EDuration="8.882435022s" podCreationTimestamp="2025-11-26 15:01:16 +0000 UTC" firstStartedPulling="2025-11-26 15:01:17.16558004 +0000 UTC m=+644.591327644" lastFinishedPulling="2025-11-26 15:01:24.044641332 +0000 UTC m=+651.470388936" observedRunningTime="2025-11-26 15:01:24.88017872 +0000 UTC m=+652.305926324" watchObservedRunningTime="2025-11-26 15:01:24.882435022 +0000 UTC m=+652.308182636" Nov 26 15:01:24 crc kubenswrapper[4651]: I1126 15:01:24.903829 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" podStartSLOduration=2.89403021 podStartE2EDuration="9.903809528s" podCreationTimestamp="2025-11-26 15:01:15 +0000 UTC" firstStartedPulling="2025-11-26 15:01:17.009837319 +0000 UTC m=+644.435584923" lastFinishedPulling="2025-11-26 15:01:24.019616637 +0000 UTC m=+651.445364241" observedRunningTime="2025-11-26 15:01:24.900457486 +0000 UTC m=+652.326205110" watchObservedRunningTime="2025-11-26 15:01:24.903809528 +0000 UTC m=+652.329557132" Nov 26 15:01:29 crc kubenswrapper[4651]: I1126 15:01:29.169875 4651 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 26 15:01:36 crc kubenswrapper[4651]: I1126 15:01:36.570911 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7d67bf6468-cdmmw" Nov 26 15:01:56 crc kubenswrapper[4651]: I1126 15:01:56.380449 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.203693 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z"] Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.205589 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.218250 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29-cert\") pod \"frr-k8s-webhook-server-6998585d5-xrp9z\" (UID: \"0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.218331 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74ktb\" (UniqueName: \"kubernetes.io/projected/0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29-kube-api-access-74ktb\") pod \"frr-k8s-webhook-server-6998585d5-xrp9z\" (UID: \"0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.218714 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 
15:01:57.220349 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-x92qn" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.228826 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jbhmh"] Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.254263 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z"] Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.254418 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.259599 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.259735 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.319221 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29-cert\") pod \"frr-k8s-webhook-server-6998585d5-xrp9z\" (UID: \"0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.319419 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74ktb\" (UniqueName: \"kubernetes.io/projected/0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29-kube-api-access-74ktb\") pod \"frr-k8s-webhook-server-6998585d5-xrp9z\" (UID: \"0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" Nov 26 15:01:57 crc kubenswrapper[4651]: E1126 15:01:57.319480 4651 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Nov 26 15:01:57 
crc kubenswrapper[4651]: E1126 15:01:57.319732 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29-cert podName:0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29 nodeName:}" failed. No retries permitted until 2025-11-26 15:01:57.81971025 +0000 UTC m=+685.245457914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29-cert") pod "frr-k8s-webhook-server-6998585d5-xrp9z" (UID: "0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29") : secret "frr-k8s-webhook-server-cert" not found Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.352671 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74ktb\" (UniqueName: \"kubernetes.io/projected/0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29-kube-api-access-74ktb\") pod \"frr-k8s-webhook-server-6998585d5-xrp9z\" (UID: \"0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.383577 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-p9nbz"] Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.384854 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-p9nbz" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.387621 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.387650 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.387819 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-2sdkq" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.389345 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.420563 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-frr-sockets\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.420859 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-reloader\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.420971 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1482b73c-9ed6-4292-9061-9df617e0f312-metrics-certs\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.421101 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-frr-conf\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.421209 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1482b73c-9ed6-4292-9061-9df617e0f312-frr-startup\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.421356 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-metrics\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.421442 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpqkl\" (UniqueName: \"kubernetes.io/projected/1482b73c-9ed6-4292-9061-9df617e0f312-kube-api-access-lpqkl\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.458299 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-f7fh8"] Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.459372 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.462327 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.522661 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-frr-sockets\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.522716 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-metrics-certs\") pod \"controller-6c7b4b5f48-f7fh8\" (UID: \"46d32f11-ab11-45c9-8ba1-118d3cf10bcd\") " pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.522742 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h8pb\" (UniqueName: \"kubernetes.io/projected/05a4e64e-8b51-45a8-be15-4c081281809f-kube-api-access-7h8pb\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.522768 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-reloader\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.522798 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1482b73c-9ed6-4292-9061-9df617e0f312-metrics-certs\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.522817 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-frr-conf\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.522835 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-metrics-certs\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.522852 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1482b73c-9ed6-4292-9061-9df617e0f312-frr-startup\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.522898 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-metrics\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.522922 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpqkl\" (UniqueName: \"kubernetes.io/projected/1482b73c-9ed6-4292-9061-9df617e0f312-kube-api-access-lpqkl\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" 
Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.522974 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-cert\") pod \"controller-6c7b4b5f48-f7fh8\" (UID: \"46d32f11-ab11-45c9-8ba1-118d3cf10bcd\") " pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.523007 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ck49\" (UniqueName: \"kubernetes.io/projected/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-kube-api-access-9ck49\") pod \"controller-6c7b4b5f48-f7fh8\" (UID: \"46d32f11-ab11-45c9-8ba1-118d3cf10bcd\") " pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.523094 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/05a4e64e-8b51-45a8-be15-4c081281809f-metallb-excludel2\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.523095 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-frr-sockets\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.523130 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-reloader\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.523157 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-memberlist\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.523371 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-frr-conf\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.523612 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1482b73c-9ed6-4292-9061-9df617e0f312-metrics\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.524193 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-f7fh8"] Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.524327 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1482b73c-9ed6-4292-9061-9df617e0f312-frr-startup\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.531838 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1482b73c-9ed6-4292-9061-9df617e0f312-metrics-certs\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.558554 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpqkl\" 
(UniqueName: \"kubernetes.io/projected/1482b73c-9ed6-4292-9061-9df617e0f312-kube-api-access-lpqkl\") pod \"frr-k8s-jbhmh\" (UID: \"1482b73c-9ed6-4292-9061-9df617e0f312\") " pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.580659 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.623478 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-metrics-certs\") pod \"controller-6c7b4b5f48-f7fh8\" (UID: \"46d32f11-ab11-45c9-8ba1-118d3cf10bcd\") " pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.623704 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h8pb\" (UniqueName: \"kubernetes.io/projected/05a4e64e-8b51-45a8-be15-4c081281809f-kube-api-access-7h8pb\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.623793 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-metrics-certs\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.623881 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-cert\") pod \"controller-6c7b4b5f48-f7fh8\" (UID: \"46d32f11-ab11-45c9-8ba1-118d3cf10bcd\") " pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:57 crc kubenswrapper[4651]: E1126 15:01:57.623662 4651 secret.go:188] Couldn't get secret 
metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Nov 26 15:01:57 crc kubenswrapper[4651]: E1126 15:01:57.623935 4651 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Nov 26 15:01:57 crc kubenswrapper[4651]: E1126 15:01:57.624061 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-metrics-certs podName:46d32f11-ab11-45c9-8ba1-118d3cf10bcd nodeName:}" failed. No retries permitted until 2025-11-26 15:01:58.12401906 +0000 UTC m=+685.549766754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-metrics-certs") pod "controller-6c7b4b5f48-f7fh8" (UID: "46d32f11-ab11-45c9-8ba1-118d3cf10bcd") : secret "controller-certs-secret" not found Nov 26 15:01:57 crc kubenswrapper[4651]: E1126 15:01:57.624112 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-metrics-certs podName:05a4e64e-8b51-45a8-be15-4c081281809f nodeName:}" failed. No retries permitted until 2025-11-26 15:01:58.124089342 +0000 UTC m=+685.549837046 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-metrics-certs") pod "speaker-p9nbz" (UID: "05a4e64e-8b51-45a8-be15-4c081281809f") : secret "speaker-certs-secret" not found Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.623964 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ck49\" (UniqueName: \"kubernetes.io/projected/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-kube-api-access-9ck49\") pod \"controller-6c7b4b5f48-f7fh8\" (UID: \"46d32f11-ab11-45c9-8ba1-118d3cf10bcd\") " pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.624210 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/05a4e64e-8b51-45a8-be15-4c081281809f-metallb-excludel2\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.624282 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-memberlist\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:57 crc kubenswrapper[4651]: E1126 15:01:57.624408 4651 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 26 15:01:57 crc kubenswrapper[4651]: E1126 15:01:57.624433 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-memberlist podName:05a4e64e-8b51-45a8-be15-4c081281809f nodeName:}" failed. No retries permitted until 2025-11-26 15:01:58.124426531 +0000 UTC m=+685.550174135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-memberlist") pod "speaker-p9nbz" (UID: "05a4e64e-8b51-45a8-be15-4c081281809f") : secret "metallb-memberlist" not found Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.625205 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/05a4e64e-8b51-45a8-be15-4c081281809f-metallb-excludel2\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.628816 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.638623 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-cert\") pod \"controller-6c7b4b5f48-f7fh8\" (UID: \"46d32f11-ab11-45c9-8ba1-118d3cf10bcd\") " pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.648698 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h8pb\" (UniqueName: \"kubernetes.io/projected/05a4e64e-8b51-45a8-be15-4c081281809f-kube-api-access-7h8pb\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.652270 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ck49\" (UniqueName: \"kubernetes.io/projected/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-kube-api-access-9ck49\") pod \"controller-6c7b4b5f48-f7fh8\" (UID: \"46d32f11-ab11-45c9-8ba1-118d3cf10bcd\") " pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.825872 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29-cert\") pod \"frr-k8s-webhook-server-6998585d5-xrp9z\" (UID: \"0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.829183 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29-cert\") pod \"frr-k8s-webhook-server-6998585d5-xrp9z\" (UID: \"0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" Nov 26 15:01:57 crc kubenswrapper[4651]: I1126 15:01:57.845105 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" Nov 26 15:01:58 crc kubenswrapper[4651]: I1126 15:01:58.048243 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbhmh" event={"ID":"1482b73c-9ed6-4292-9061-9df617e0f312","Type":"ContainerStarted","Data":"1c88b35f20f8e41c9f87eff1c174eacb483be3a763ba132434d3fe5a951f0391"} Nov 26 15:01:58 crc kubenswrapper[4651]: I1126 15:01:58.129717 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-metrics-certs\") pod \"controller-6c7b4b5f48-f7fh8\" (UID: \"46d32f11-ab11-45c9-8ba1-118d3cf10bcd\") " pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:58 crc kubenswrapper[4651]: I1126 15:01:58.129771 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-metrics-certs\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:58 crc kubenswrapper[4651]: 
I1126 15:01:58.129850 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-memberlist\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:58 crc kubenswrapper[4651]: E1126 15:01:58.129937 4651 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 26 15:01:58 crc kubenswrapper[4651]: E1126 15:01:58.129991 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-memberlist podName:05a4e64e-8b51-45a8-be15-4c081281809f nodeName:}" failed. No retries permitted until 2025-11-26 15:01:59.129975387 +0000 UTC m=+686.555722991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-memberlist") pod "speaker-p9nbz" (UID: "05a4e64e-8b51-45a8-be15-4c081281809f") : secret "metallb-memberlist" not found Nov 26 15:01:58 crc kubenswrapper[4651]: I1126 15:01:58.133149 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-metrics-certs\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:58 crc kubenswrapper[4651]: I1126 15:01:58.133668 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46d32f11-ab11-45c9-8ba1-118d3cf10bcd-metrics-certs\") pod \"controller-6c7b4b5f48-f7fh8\" (UID: \"46d32f11-ab11-45c9-8ba1-118d3cf10bcd\") " pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:58 crc kubenswrapper[4651]: I1126 15:01:58.235747 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z"] Nov 26 15:01:58 crc kubenswrapper[4651]: W1126 15:01:58.240542 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d2b53ca_9ab2_4845_a2f8_eacbe6fa4e29.slice/crio-b05ae4b10ee6e61681a3d836a1d8b3a6fce9455c39aef823d96701c5e6c49caa WatchSource:0}: Error finding container b05ae4b10ee6e61681a3d836a1d8b3a6fce9455c39aef823d96701c5e6c49caa: Status 404 returned error can't find the container with id b05ae4b10ee6e61681a3d836a1d8b3a6fce9455c39aef823d96701c5e6c49caa Nov 26 15:01:58 crc kubenswrapper[4651]: I1126 15:01:58.373024 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:58 crc kubenswrapper[4651]: I1126 15:01:58.790380 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-f7fh8"] Nov 26 15:01:59 crc kubenswrapper[4651]: I1126 15:01:59.054315 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" event={"ID":"0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29","Type":"ContainerStarted","Data":"b05ae4b10ee6e61681a3d836a1d8b3a6fce9455c39aef823d96701c5e6c49caa"} Nov 26 15:01:59 crc kubenswrapper[4651]: I1126 15:01:59.056443 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-f7fh8" event={"ID":"46d32f11-ab11-45c9-8ba1-118d3cf10bcd","Type":"ContainerStarted","Data":"015d773e281bbb3744f12ba2fcc82be8c1d16db93631679ac61ed895ba7026a8"} Nov 26 15:01:59 crc kubenswrapper[4651]: I1126 15:01:59.056488 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-f7fh8" event={"ID":"46d32f11-ab11-45c9-8ba1-118d3cf10bcd","Type":"ContainerStarted","Data":"4f466f30ef4235c1345e025d98920a853ffe6d237200c9ad6fb26390f0ab6a3b"} Nov 26 15:01:59 crc kubenswrapper[4651]: I1126 15:01:59.056502 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-f7fh8" event={"ID":"46d32f11-ab11-45c9-8ba1-118d3cf10bcd","Type":"ContainerStarted","Data":"8c567b8c418fd0532df5b1a914d07aa5bb80f38dc76126d195ef50d9a4375be9"} Nov 26 15:01:59 crc kubenswrapper[4651]: I1126 15:01:59.057348 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:01:59 crc kubenswrapper[4651]: I1126 15:01:59.076127 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-f7fh8" podStartSLOduration=2.076109063 podStartE2EDuration="2.076109063s" podCreationTimestamp="2025-11-26 15:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:01:59.074247282 +0000 UTC m=+686.499994896" watchObservedRunningTime="2025-11-26 15:01:59.076109063 +0000 UTC m=+686.501856677" Nov 26 15:01:59 crc kubenswrapper[4651]: I1126 15:01:59.132224 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:01:59 crc kubenswrapper[4651]: I1126 15:01:59.132285 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:01:59 crc kubenswrapper[4651]: I1126 15:01:59.143011 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-memberlist\") 
pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:59 crc kubenswrapper[4651]: I1126 15:01:59.148424 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/05a4e64e-8b51-45a8-be15-4c081281809f-memberlist\") pod \"speaker-p9nbz\" (UID: \"05a4e64e-8b51-45a8-be15-4c081281809f\") " pod="metallb-system/speaker-p9nbz" Nov 26 15:01:59 crc kubenswrapper[4651]: I1126 15:01:59.198843 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-p9nbz" Nov 26 15:01:59 crc kubenswrapper[4651]: W1126 15:01:59.217442 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05a4e64e_8b51_45a8_be15_4c081281809f.slice/crio-d3c40c675998c95e9d18ab0b2700b4c815e8f9e4610ef62c2a8c882a2c19a405 WatchSource:0}: Error finding container d3c40c675998c95e9d18ab0b2700b4c815e8f9e4610ef62c2a8c882a2c19a405: Status 404 returned error can't find the container with id d3c40c675998c95e9d18ab0b2700b4c815e8f9e4610ef62c2a8c882a2c19a405 Nov 26 15:02:00 crc kubenswrapper[4651]: I1126 15:02:00.067664 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-p9nbz" event={"ID":"05a4e64e-8b51-45a8-be15-4c081281809f","Type":"ContainerStarted","Data":"b6a8d0f60137820bd8d645ef9e8e0db8a0d2858ee8296bd8d13ca4c2b56b1bfa"} Nov 26 15:02:00 crc kubenswrapper[4651]: I1126 15:02:00.067917 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-p9nbz" event={"ID":"05a4e64e-8b51-45a8-be15-4c081281809f","Type":"ContainerStarted","Data":"987ee8c1123a400b9b3230c3f5467c8887e24bd2d4c61a65dadb583d9ceac400"} Nov 26 15:02:00 crc kubenswrapper[4651]: I1126 15:02:00.067929 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-p9nbz" 
event={"ID":"05a4e64e-8b51-45a8-be15-4c081281809f","Type":"ContainerStarted","Data":"d3c40c675998c95e9d18ab0b2700b4c815e8f9e4610ef62c2a8c882a2c19a405"} Nov 26 15:02:00 crc kubenswrapper[4651]: I1126 15:02:00.068343 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-p9nbz" Nov 26 15:02:00 crc kubenswrapper[4651]: I1126 15:02:00.097966 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-p9nbz" podStartSLOduration=3.097951451 podStartE2EDuration="3.097951451s" podCreationTimestamp="2025-11-26 15:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:02:00.096362097 +0000 UTC m=+687.522109701" watchObservedRunningTime="2025-11-26 15:02:00.097951451 +0000 UTC m=+687.523699055" Nov 26 15:02:01 crc kubenswrapper[4651]: I1126 15:02:01.912777 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zcnps"] Nov 26 15:02:01 crc kubenswrapper[4651]: I1126 15:02:01.914085 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:01 crc kubenswrapper[4651]: I1126 15:02:01.937089 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zcnps"] Nov 26 15:02:02 crc kubenswrapper[4651]: I1126 15:02:02.095262 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skjps\" (UniqueName: \"kubernetes.io/projected/5282ea53-ddaf-4f19-863f-d22f4cde4570-kube-api-access-skjps\") pod \"redhat-operators-zcnps\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:02 crc kubenswrapper[4651]: I1126 15:02:02.095349 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-catalog-content\") pod \"redhat-operators-zcnps\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:02 crc kubenswrapper[4651]: I1126 15:02:02.095433 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-utilities\") pod \"redhat-operators-zcnps\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:02 crc kubenswrapper[4651]: I1126 15:02:02.196605 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-utilities\") pod \"redhat-operators-zcnps\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:02 crc kubenswrapper[4651]: I1126 15:02:02.196683 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-skjps\" (UniqueName: \"kubernetes.io/projected/5282ea53-ddaf-4f19-863f-d22f4cde4570-kube-api-access-skjps\") pod \"redhat-operators-zcnps\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:02 crc kubenswrapper[4651]: I1126 15:02:02.196736 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-catalog-content\") pod \"redhat-operators-zcnps\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:02 crc kubenswrapper[4651]: I1126 15:02:02.197212 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-utilities\") pod \"redhat-operators-zcnps\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:02 crc kubenswrapper[4651]: I1126 15:02:02.197278 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-catalog-content\") pod \"redhat-operators-zcnps\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:02 crc kubenswrapper[4651]: I1126 15:02:02.221325 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skjps\" (UniqueName: \"kubernetes.io/projected/5282ea53-ddaf-4f19-863f-d22f4cde4570-kube-api-access-skjps\") pod \"redhat-operators-zcnps\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:02 crc kubenswrapper[4651]: I1126 15:02:02.231522 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:02 crc kubenswrapper[4651]: I1126 15:02:02.704755 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zcnps"] Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.086795 4651 generic.go:334] "Generic (PLEG): container finished" podID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerID="a7a5ccdc6d1c5f92a3fc84086e8ea2a797642895b9585a66b8b41c3f4fc2b540" exitCode=0 Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.087090 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcnps" event={"ID":"5282ea53-ddaf-4f19-863f-d22f4cde4570","Type":"ContainerDied","Data":"a7a5ccdc6d1c5f92a3fc84086e8ea2a797642895b9585a66b8b41c3f4fc2b540"} Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.087117 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcnps" event={"ID":"5282ea53-ddaf-4f19-863f-d22f4cde4570","Type":"ContainerStarted","Data":"965bbc06c82c53d3f9a46ca6b3bed3fd8afafedc03e19fa3db540fc7745bfeeb"} Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.691816 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vk2dq"] Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.693063 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.711801 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vk2dq"] Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.819777 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9srh\" (UniqueName: \"kubernetes.io/projected/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-kube-api-access-d9srh\") pod \"certified-operators-vk2dq\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.819826 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-utilities\") pod \"certified-operators-vk2dq\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.819879 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-catalog-content\") pod \"certified-operators-vk2dq\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.921567 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-catalog-content\") pod \"certified-operators-vk2dq\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.921843 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d9srh\" (UniqueName: \"kubernetes.io/projected/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-kube-api-access-d9srh\") pod \"certified-operators-vk2dq\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.921868 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-utilities\") pod \"certified-operators-vk2dq\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.922364 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-utilities\") pod \"certified-operators-vk2dq\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.922568 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-catalog-content\") pod \"certified-operators-vk2dq\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:03 crc kubenswrapper[4651]: I1126 15:02:03.941777 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9srh\" (UniqueName: \"kubernetes.io/projected/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-kube-api-access-d9srh\") pod \"certified-operators-vk2dq\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:04 crc kubenswrapper[4651]: I1126 15:02:04.012498 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:04 crc kubenswrapper[4651]: I1126 15:02:04.613579 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vk2dq"] Nov 26 15:02:04 crc kubenswrapper[4651]: W1126 15:02:04.626127 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b0be32e_a5a9_4d99_86a1_bf2ee3ea64aa.slice/crio-1d09d2371b6ae06ece822dfcfd6f7dc4fa3bea057e27983c7c1e52648ac0179b WatchSource:0}: Error finding container 1d09d2371b6ae06ece822dfcfd6f7dc4fa3bea057e27983c7c1e52648ac0179b: Status 404 returned error can't find the container with id 1d09d2371b6ae06ece822dfcfd6f7dc4fa3bea057e27983c7c1e52648ac0179b Nov 26 15:02:05 crc kubenswrapper[4651]: I1126 15:02:05.111579 4651 generic.go:334] "Generic (PLEG): container finished" podID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" containerID="fb78c32c23295e4e30736bdb762f180a3f0db44e1f19c064ea12bf4d5e63b1e4" exitCode=0 Nov 26 15:02:05 crc kubenswrapper[4651]: I1126 15:02:05.112566 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk2dq" event={"ID":"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa","Type":"ContainerDied","Data":"fb78c32c23295e4e30736bdb762f180a3f0db44e1f19c064ea12bf4d5e63b1e4"} Nov 26 15:02:05 crc kubenswrapper[4651]: I1126 15:02:05.112617 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk2dq" event={"ID":"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa","Type":"ContainerStarted","Data":"1d09d2371b6ae06ece822dfcfd6f7dc4fa3bea057e27983c7c1e52648ac0179b"} Nov 26 15:02:05 crc kubenswrapper[4651]: I1126 15:02:05.117427 4651 generic.go:334] "Generic (PLEG): container finished" podID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerID="430ccb255a261177a9348e088185799a5cad0473301b5fe9cc0af84c4e38a3b9" exitCode=0 Nov 26 15:02:05 crc kubenswrapper[4651]: I1126 
15:02:05.117461 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcnps" event={"ID":"5282ea53-ddaf-4f19-863f-d22f4cde4570","Type":"ContainerDied","Data":"430ccb255a261177a9348e088185799a5cad0473301b5fe9cc0af84c4e38a3b9"} Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.082763 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-csn5m"] Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.084144 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.105676 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-csn5m"] Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.262103 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-utilities\") pod \"redhat-marketplace-csn5m\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.262211 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-catalog-content\") pod \"redhat-marketplace-csn5m\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.262243 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9ts4\" (UniqueName: \"kubernetes.io/projected/d0397299-e80d-4c6e-9634-b6f64ad039be-kube-api-access-s9ts4\") pod \"redhat-marketplace-csn5m\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " 
pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.364359 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-utilities\") pod \"redhat-marketplace-csn5m\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.364975 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-catalog-content\") pod \"redhat-marketplace-csn5m\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.365013 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9ts4\" (UniqueName: \"kubernetes.io/projected/d0397299-e80d-4c6e-9634-b6f64ad039be-kube-api-access-s9ts4\") pod \"redhat-marketplace-csn5m\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.364893 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-utilities\") pod \"redhat-marketplace-csn5m\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.365745 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-catalog-content\") pod \"redhat-marketplace-csn5m\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " pod="openshift-marketplace/redhat-marketplace-csn5m" 
Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.388199 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9ts4\" (UniqueName: \"kubernetes.io/projected/d0397299-e80d-4c6e-9634-b6f64ad039be-kube-api-access-s9ts4\") pod \"redhat-marketplace-csn5m\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:06 crc kubenswrapper[4651]: I1126 15:02:06.405246 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:08 crc kubenswrapper[4651]: I1126 15:02:08.155985 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" event={"ID":"0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29","Type":"ContainerStarted","Data":"0a041f9baced21f4350dbf122ac562dbbb8e9385e49cfae6d3e6de6fea640384"} Nov 26 15:02:08 crc kubenswrapper[4651]: I1126 15:02:08.157361 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" Nov 26 15:02:08 crc kubenswrapper[4651]: I1126 15:02:08.157697 4651 generic.go:334] "Generic (PLEG): container finished" podID="1482b73c-9ed6-4292-9061-9df617e0f312" containerID="ce379667151026e6c7ce4f832ae0cbc119e8606dba59b5d84c61bbba9539c255" exitCode=0 Nov 26 15:02:08 crc kubenswrapper[4651]: I1126 15:02:08.157725 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbhmh" event={"ID":"1482b73c-9ed6-4292-9061-9df617e0f312","Type":"ContainerDied","Data":"ce379667151026e6c7ce4f832ae0cbc119e8606dba59b5d84c61bbba9539c255"} Nov 26 15:02:08 crc kubenswrapper[4651]: I1126 15:02:08.182204 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" podStartSLOduration=1.5596914100000001 podStartE2EDuration="11.182164964s" podCreationTimestamp="2025-11-26 15:01:57 +0000 UTC" 
firstStartedPulling="2025-11-26 15:01:58.242418315 +0000 UTC m=+685.668165919" lastFinishedPulling="2025-11-26 15:02:07.864891869 +0000 UTC m=+695.290639473" observedRunningTime="2025-11-26 15:02:08.177486095 +0000 UTC m=+695.603233709" watchObservedRunningTime="2025-11-26 15:02:08.182164964 +0000 UTC m=+695.607912568" Nov 26 15:02:08 crc kubenswrapper[4651]: I1126 15:02:08.235183 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-csn5m"] Nov 26 15:02:08 crc kubenswrapper[4651]: W1126 15:02:08.246967 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0397299_e80d_4c6e_9634_b6f64ad039be.slice/crio-d5b54ccae580d46fb969f2630439f39b5bbcd26535f5216766be9efb2032c8a0 WatchSource:0}: Error finding container d5b54ccae580d46fb969f2630439f39b5bbcd26535f5216766be9efb2032c8a0: Status 404 returned error can't find the container with id d5b54ccae580d46fb969f2630439f39b5bbcd26535f5216766be9efb2032c8a0 Nov 26 15:02:08 crc kubenswrapper[4651]: I1126 15:02:08.379535 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-f7fh8" Nov 26 15:02:09 crc kubenswrapper[4651]: I1126 15:02:09.163020 4651 generic.go:334] "Generic (PLEG): container finished" podID="d0397299-e80d-4c6e-9634-b6f64ad039be" containerID="71bd23ed8905dd028dd0856901b418682a8a740a93d5cad3652020d16c107f4c" exitCode=0 Nov 26 15:02:09 crc kubenswrapper[4651]: I1126 15:02:09.164398 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csn5m" event={"ID":"d0397299-e80d-4c6e-9634-b6f64ad039be","Type":"ContainerDied","Data":"71bd23ed8905dd028dd0856901b418682a8a740a93d5cad3652020d16c107f4c"} Nov 26 15:02:09 crc kubenswrapper[4651]: I1126 15:02:09.165203 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csn5m" 
event={"ID":"d0397299-e80d-4c6e-9634-b6f64ad039be","Type":"ContainerStarted","Data":"d5b54ccae580d46fb969f2630439f39b5bbcd26535f5216766be9efb2032c8a0"} Nov 26 15:02:09 crc kubenswrapper[4651]: I1126 15:02:09.168226 4651 generic.go:334] "Generic (PLEG): container finished" podID="1482b73c-9ed6-4292-9061-9df617e0f312" containerID="11f46ce8cc5256d66b04a5af03c29c05859001eaa52647723619ec61830da445" exitCode=0 Nov 26 15:02:09 crc kubenswrapper[4651]: I1126 15:02:09.168292 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbhmh" event={"ID":"1482b73c-9ed6-4292-9061-9df617e0f312","Type":"ContainerDied","Data":"11f46ce8cc5256d66b04a5af03c29c05859001eaa52647723619ec61830da445"} Nov 26 15:02:09 crc kubenswrapper[4651]: I1126 15:02:09.171013 4651 generic.go:334] "Generic (PLEG): container finished" podID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" containerID="429da556ee0f5fa405394d134d3bc67d8104be9f88ab6e8b9cd71391448f7196" exitCode=0 Nov 26 15:02:09 crc kubenswrapper[4651]: I1126 15:02:09.171086 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk2dq" event={"ID":"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa","Type":"ContainerDied","Data":"429da556ee0f5fa405394d134d3bc67d8104be9f88ab6e8b9cd71391448f7196"} Nov 26 15:02:09 crc kubenswrapper[4651]: I1126 15:02:09.173773 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcnps" event={"ID":"5282ea53-ddaf-4f19-863f-d22f4cde4570","Type":"ContainerStarted","Data":"d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef"} Nov 26 15:02:09 crc kubenswrapper[4651]: I1126 15:02:09.203329 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-p9nbz" Nov 26 15:02:09 crc kubenswrapper[4651]: I1126 15:02:09.207141 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zcnps" podStartSLOduration=3.04202452 
podStartE2EDuration="8.207125787s" podCreationTimestamp="2025-11-26 15:02:01 +0000 UTC" firstStartedPulling="2025-11-26 15:02:03.088313356 +0000 UTC m=+690.514060960" lastFinishedPulling="2025-11-26 15:02:08.253414623 +0000 UTC m=+695.679162227" observedRunningTime="2025-11-26 15:02:09.205388299 +0000 UTC m=+696.631135903" watchObservedRunningTime="2025-11-26 15:02:09.207125787 +0000 UTC m=+696.632873391" Nov 26 15:02:10 crc kubenswrapper[4651]: I1126 15:02:10.182755 4651 generic.go:334] "Generic (PLEG): container finished" podID="1482b73c-9ed6-4292-9061-9df617e0f312" containerID="0f3baa884f3cdf3ab0b1621d941b749b52d0517c54d5cb250e7a2586305a8155" exitCode=0 Nov 26 15:02:10 crc kubenswrapper[4651]: I1126 15:02:10.182807 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbhmh" event={"ID":"1482b73c-9ed6-4292-9061-9df617e0f312","Type":"ContainerDied","Data":"0f3baa884f3cdf3ab0b1621d941b749b52d0517c54d5cb250e7a2586305a8155"} Nov 26 15:02:11 crc kubenswrapper[4651]: I1126 15:02:11.191615 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csn5m" event={"ID":"d0397299-e80d-4c6e-9634-b6f64ad039be","Type":"ContainerStarted","Data":"a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54"} Nov 26 15:02:11 crc kubenswrapper[4651]: I1126 15:02:11.196928 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbhmh" event={"ID":"1482b73c-9ed6-4292-9061-9df617e0f312","Type":"ContainerStarted","Data":"fe91b6ed523eab40fa941e61ca97a4c351f13a2db32824ccc5bc564126ea1cf3"} Nov 26 15:02:11 crc kubenswrapper[4651]: I1126 15:02:11.196985 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbhmh" event={"ID":"1482b73c-9ed6-4292-9061-9df617e0f312","Type":"ContainerStarted","Data":"63cd786e0ea7fcc53f712a6e7ff7611eca51b05af23ad86f2ff660cebfb88a2d"} Nov 26 15:02:11 crc kubenswrapper[4651]: I1126 15:02:11.197000 4651 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-jbhmh" event={"ID":"1482b73c-9ed6-4292-9061-9df617e0f312","Type":"ContainerStarted","Data":"17d0d6afd65f2e8e29fbe811369ba7ba3f0434bb37ee473d5b3429a4e2fbe8ee"} Nov 26 15:02:11 crc kubenswrapper[4651]: I1126 15:02:11.197012 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbhmh" event={"ID":"1482b73c-9ed6-4292-9061-9df617e0f312","Type":"ContainerStarted","Data":"4f65df817b65b1bd905e76f85aa67c55ed3bd0883d066d2ffde05f7e0e0ec860"} Nov 26 15:02:11 crc kubenswrapper[4651]: I1126 15:02:11.199714 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk2dq" event={"ID":"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa","Type":"ContainerStarted","Data":"ae40902fff531e51a279a534218854f0a632177a817f4f7e13a9bc47502de006"} Nov 26 15:02:11 crc kubenswrapper[4651]: I1126 15:02:11.233746 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vk2dq" podStartSLOduration=5.633841906 podStartE2EDuration="8.233728104s" podCreationTimestamp="2025-11-26 15:02:03 +0000 UTC" firstStartedPulling="2025-11-26 15:02:07.705647852 +0000 UTC m=+695.131395456" lastFinishedPulling="2025-11-26 15:02:10.30553405 +0000 UTC m=+697.731281654" observedRunningTime="2025-11-26 15:02:11.22990717 +0000 UTC m=+698.655654784" watchObservedRunningTime="2025-11-26 15:02:11.233728104 +0000 UTC m=+698.659475708" Nov 26 15:02:12 crc kubenswrapper[4651]: I1126 15:02:12.213941 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbhmh" event={"ID":"1482b73c-9ed6-4292-9061-9df617e0f312","Type":"ContainerStarted","Data":"e430c1fe6e031c75ebb90f475da3a9171b8fe8f3faf6feb7650a3b520e187c6a"} Nov 26 15:02:12 crc kubenswrapper[4651]: I1126 15:02:12.213981 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbhmh" 
event={"ID":"1482b73c-9ed6-4292-9061-9df617e0f312","Type":"ContainerStarted","Data":"17a5e1dca030e18566089c36719106201739e63557f410298c02f59a982d3ad3"} Nov 26 15:02:12 crc kubenswrapper[4651]: I1126 15:02:12.232187 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:12 crc kubenswrapper[4651]: I1126 15:02:12.232247 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:13 crc kubenswrapper[4651]: I1126 15:02:13.223769 4651 generic.go:334] "Generic (PLEG): container finished" podID="d0397299-e80d-4c6e-9634-b6f64ad039be" containerID="a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54" exitCode=0 Nov 26 15:02:13 crc kubenswrapper[4651]: I1126 15:02:13.223909 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csn5m" event={"ID":"d0397299-e80d-4c6e-9634-b6f64ad039be","Type":"ContainerDied","Data":"a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54"} Nov 26 15:02:13 crc kubenswrapper[4651]: I1126 15:02:13.224388 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:02:13 crc kubenswrapper[4651]: I1126 15:02:13.262677 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jbhmh" podStartSLOduration=6.149315425 podStartE2EDuration="16.262658275s" podCreationTimestamp="2025-11-26 15:01:57 +0000 UTC" firstStartedPulling="2025-11-26 15:01:57.710225609 +0000 UTC m=+685.135973213" lastFinishedPulling="2025-11-26 15:02:07.823568459 +0000 UTC m=+695.249316063" observedRunningTime="2025-11-26 15:02:13.257181225 +0000 UTC m=+700.682928839" watchObservedRunningTime="2025-11-26 15:02:13.262658275 +0000 UTC m=+700.688405879" Nov 26 15:02:13 crc kubenswrapper[4651]: I1126 15:02:13.291423 4651 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-zcnps" podUID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerName="registry-server" probeResult="failure" output=< Nov 26 15:02:13 crc kubenswrapper[4651]: timeout: failed to connect service ":50051" within 1s Nov 26 15:02:13 crc kubenswrapper[4651]: > Nov 26 15:02:14 crc kubenswrapper[4651]: I1126 15:02:14.016310 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:14 crc kubenswrapper[4651]: I1126 15:02:14.016345 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:14 crc kubenswrapper[4651]: I1126 15:02:14.081104 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:15 crc kubenswrapper[4651]: I1126 15:02:15.237223 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csn5m" event={"ID":"d0397299-e80d-4c6e-9634-b6f64ad039be","Type":"ContainerStarted","Data":"f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50"} Nov 26 15:02:15 crc kubenswrapper[4651]: I1126 15:02:15.258766 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-csn5m" podStartSLOduration=3.869092173 podStartE2EDuration="9.258749177s" podCreationTimestamp="2025-11-26 15:02:06 +0000 UTC" firstStartedPulling="2025-11-26 15:02:09.165560959 +0000 UTC m=+696.591308563" lastFinishedPulling="2025-11-26 15:02:14.555217963 +0000 UTC m=+701.980965567" observedRunningTime="2025-11-26 15:02:15.256698721 +0000 UTC m=+702.682446335" watchObservedRunningTime="2025-11-26 15:02:15.258749177 +0000 UTC m=+702.684496781" Nov 26 15:02:16 crc kubenswrapper[4651]: I1126 15:02:16.406082 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:16 crc kubenswrapper[4651]: I1126 15:02:16.406365 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:16 crc kubenswrapper[4651]: I1126 15:02:16.458862 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:16 crc kubenswrapper[4651]: I1126 15:02:16.682783 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lmwxn"] Nov 26 15:02:16 crc kubenswrapper[4651]: I1126 15:02:16.685865 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lmwxn" Nov 26 15:02:16 crc kubenswrapper[4651]: I1126 15:02:16.696892 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 26 15:02:16 crc kubenswrapper[4651]: I1126 15:02:16.697011 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 26 15:02:16 crc kubenswrapper[4651]: I1126 15:02:16.697514 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-p822z" Nov 26 15:02:16 crc kubenswrapper[4651]: I1126 15:02:16.725195 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lmwxn"] Nov 26 15:02:16 crc kubenswrapper[4651]: I1126 15:02:16.737636 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nk2b\" (UniqueName: \"kubernetes.io/projected/58c9e481-e131-494a-808a-c27aaae0ebaa-kube-api-access-6nk2b\") pod \"openstack-operator-index-lmwxn\" (UID: \"58c9e481-e131-494a-808a-c27aaae0ebaa\") " pod="openstack-operators/openstack-operator-index-lmwxn" Nov 26 15:02:16 crc 
kubenswrapper[4651]: I1126 15:02:16.838735 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nk2b\" (UniqueName: \"kubernetes.io/projected/58c9e481-e131-494a-808a-c27aaae0ebaa-kube-api-access-6nk2b\") pod \"openstack-operator-index-lmwxn\" (UID: \"58c9e481-e131-494a-808a-c27aaae0ebaa\") " pod="openstack-operators/openstack-operator-index-lmwxn" Nov 26 15:02:16 crc kubenswrapper[4651]: I1126 15:02:16.861138 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nk2b\" (UniqueName: \"kubernetes.io/projected/58c9e481-e131-494a-808a-c27aaae0ebaa-kube-api-access-6nk2b\") pod \"openstack-operator-index-lmwxn\" (UID: \"58c9e481-e131-494a-808a-c27aaae0ebaa\") " pod="openstack-operators/openstack-operator-index-lmwxn" Nov 26 15:02:17 crc kubenswrapper[4651]: I1126 15:02:17.021338 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lmwxn" Nov 26 15:02:17 crc kubenswrapper[4651]: I1126 15:02:17.445249 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lmwxn"] Nov 26 15:02:17 crc kubenswrapper[4651]: W1126 15:02:17.453612 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c9e481_e131_494a_808a_c27aaae0ebaa.slice/crio-1bd746446f30df4af36ca54e21da2b3bf979ca302d984d6655a3a03522514ed5 WatchSource:0}: Error finding container 1bd746446f30df4af36ca54e21da2b3bf979ca302d984d6655a3a03522514ed5: Status 404 returned error can't find the container with id 1bd746446f30df4af36ca54e21da2b3bf979ca302d984d6655a3a03522514ed5 Nov 26 15:02:17 crc kubenswrapper[4651]: I1126 15:02:17.581739 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:02:17 crc kubenswrapper[4651]: I1126 15:02:17.628773 4651 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="metallb-system/frr-k8s-jbhmh" Nov 26 15:02:18 crc kubenswrapper[4651]: I1126 15:02:18.260292 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lmwxn" event={"ID":"58c9e481-e131-494a-808a-c27aaae0ebaa","Type":"ContainerStarted","Data":"1bd746446f30df4af36ca54e21da2b3bf979ca302d984d6655a3a03522514ed5"} Nov 26 15:02:21 crc kubenswrapper[4651]: I1126 15:02:21.279709 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lmwxn" event={"ID":"58c9e481-e131-494a-808a-c27aaae0ebaa","Type":"ContainerStarted","Data":"1d445bb82554829a4bb7a9b6c38d6ae03b176ebac0055beb070ea36469b70f2d"} Nov 26 15:02:21 crc kubenswrapper[4651]: I1126 15:02:21.334806 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lmwxn" podStartSLOduration=2.3075056 podStartE2EDuration="5.334782926s" podCreationTimestamp="2025-11-26 15:02:16 +0000 UTC" firstStartedPulling="2025-11-26 15:02:17.45838775 +0000 UTC m=+704.884135344" lastFinishedPulling="2025-11-26 15:02:20.485665066 +0000 UTC m=+707.911412670" observedRunningTime="2025-11-26 15:02:21.331387163 +0000 UTC m=+708.757134787" watchObservedRunningTime="2025-11-26 15:02:21.334782926 +0000 UTC m=+708.760530530" Nov 26 15:02:22 crc kubenswrapper[4651]: I1126 15:02:22.279209 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:22 crc kubenswrapper[4651]: I1126 15:02:22.320084 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:24 crc kubenswrapper[4651]: I1126 15:02:24.064789 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:26 crc kubenswrapper[4651]: I1126 15:02:26.464734 4651 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:26 crc kubenswrapper[4651]: I1126 15:02:26.673323 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zcnps"] Nov 26 15:02:26 crc kubenswrapper[4651]: I1126 15:02:26.673758 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zcnps" podUID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerName="registry-server" containerID="cri-o://d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef" gracePeriod=2 Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.022436 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-lmwxn" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.022506 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-lmwxn" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.059760 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-lmwxn" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.111815 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.203846 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-utilities\") pod \"5282ea53-ddaf-4f19-863f-d22f4cde4570\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.203886 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-catalog-content\") pod \"5282ea53-ddaf-4f19-863f-d22f4cde4570\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.204065 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skjps\" (UniqueName: \"kubernetes.io/projected/5282ea53-ddaf-4f19-863f-d22f4cde4570-kube-api-access-skjps\") pod \"5282ea53-ddaf-4f19-863f-d22f4cde4570\" (UID: \"5282ea53-ddaf-4f19-863f-d22f4cde4570\") " Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.204811 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-utilities" (OuterVolumeSpecName: "utilities") pod "5282ea53-ddaf-4f19-863f-d22f4cde4570" (UID: "5282ea53-ddaf-4f19-863f-d22f4cde4570"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.210207 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5282ea53-ddaf-4f19-863f-d22f4cde4570-kube-api-access-skjps" (OuterVolumeSpecName: "kube-api-access-skjps") pod "5282ea53-ddaf-4f19-863f-d22f4cde4570" (UID: "5282ea53-ddaf-4f19-863f-d22f4cde4570"). InnerVolumeSpecName "kube-api-access-skjps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.302608 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5282ea53-ddaf-4f19-863f-d22f4cde4570" (UID: "5282ea53-ddaf-4f19-863f-d22f4cde4570"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.304973 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skjps\" (UniqueName: \"kubernetes.io/projected/5282ea53-ddaf-4f19-863f-d22f4cde4570-kube-api-access-skjps\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.305005 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.305018 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5282ea53-ddaf-4f19-863f-d22f4cde4570-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.321501 4651 generic.go:334] "Generic (PLEG): container finished" podID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerID="d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef" exitCode=0 Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.321605 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zcnps" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.321586 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcnps" event={"ID":"5282ea53-ddaf-4f19-863f-d22f4cde4570","Type":"ContainerDied","Data":"d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef"} Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.321672 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zcnps" event={"ID":"5282ea53-ddaf-4f19-863f-d22f4cde4570","Type":"ContainerDied","Data":"965bbc06c82c53d3f9a46ca6b3bed3fd8afafedc03e19fa3db540fc7745bfeeb"} Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.321696 4651 scope.go:117] "RemoveContainer" containerID="d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.342128 4651 scope.go:117] "RemoveContainer" containerID="430ccb255a261177a9348e088185799a5cad0473301b5fe9cc0af84c4e38a3b9" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.356135 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zcnps"] Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.356183 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zcnps"] Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.365185 4651 scope.go:117] "RemoveContainer" containerID="a7a5ccdc6d1c5f92a3fc84086e8ea2a797642895b9585a66b8b41c3f4fc2b540" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.369458 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-lmwxn" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.380868 4651 scope.go:117] "RemoveContainer" containerID="d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef" Nov 26 15:02:27 crc 
kubenswrapper[4651]: E1126 15:02:27.381297 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef\": container with ID starting with d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef not found: ID does not exist" containerID="d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.381338 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef"} err="failed to get container status \"d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef\": rpc error: code = NotFound desc = could not find container \"d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef\": container with ID starting with d79db415f9a90d5b3ea5e4e72f48bd943b0685c79060e11481f959f675e783ef not found: ID does not exist" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.381364 4651 scope.go:117] "RemoveContainer" containerID="430ccb255a261177a9348e088185799a5cad0473301b5fe9cc0af84c4e38a3b9" Nov 26 15:02:27 crc kubenswrapper[4651]: E1126 15:02:27.382498 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430ccb255a261177a9348e088185799a5cad0473301b5fe9cc0af84c4e38a3b9\": container with ID starting with 430ccb255a261177a9348e088185799a5cad0473301b5fe9cc0af84c4e38a3b9 not found: ID does not exist" containerID="430ccb255a261177a9348e088185799a5cad0473301b5fe9cc0af84c4e38a3b9" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.382620 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430ccb255a261177a9348e088185799a5cad0473301b5fe9cc0af84c4e38a3b9"} err="failed to get container status 
\"430ccb255a261177a9348e088185799a5cad0473301b5fe9cc0af84c4e38a3b9\": rpc error: code = NotFound desc = could not find container \"430ccb255a261177a9348e088185799a5cad0473301b5fe9cc0af84c4e38a3b9\": container with ID starting with 430ccb255a261177a9348e088185799a5cad0473301b5fe9cc0af84c4e38a3b9 not found: ID does not exist" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.382724 4651 scope.go:117] "RemoveContainer" containerID="a7a5ccdc6d1c5f92a3fc84086e8ea2a797642895b9585a66b8b41c3f4fc2b540" Nov 26 15:02:27 crc kubenswrapper[4651]: E1126 15:02:27.384306 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7a5ccdc6d1c5f92a3fc84086e8ea2a797642895b9585a66b8b41c3f4fc2b540\": container with ID starting with a7a5ccdc6d1c5f92a3fc84086e8ea2a797642895b9585a66b8b41c3f4fc2b540 not found: ID does not exist" containerID="a7a5ccdc6d1c5f92a3fc84086e8ea2a797642895b9585a66b8b41c3f4fc2b540" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.384420 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a5ccdc6d1c5f92a3fc84086e8ea2a797642895b9585a66b8b41c3f4fc2b540"} err="failed to get container status \"a7a5ccdc6d1c5f92a3fc84086e8ea2a797642895b9585a66b8b41c3f4fc2b540\": rpc error: code = NotFound desc = could not find container \"a7a5ccdc6d1c5f92a3fc84086e8ea2a797642895b9585a66b8b41c3f4fc2b540\": container with ID starting with a7a5ccdc6d1c5f92a3fc84086e8ea2a797642895b9585a66b8b41c3f4fc2b540 not found: ID does not exist" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.416106 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5282ea53-ddaf-4f19-863f-d22f4cde4570" path="/var/lib/kubelet/pods/5282ea53-ddaf-4f19-863f-d22f4cde4570/volumes" Nov 26 15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.585002 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jbhmh" Nov 26 
15:02:27 crc kubenswrapper[4651]: I1126 15:02:27.858505 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-xrp9z" Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.072466 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vk2dq"] Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.073530 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vk2dq" podUID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" containerName="registry-server" containerID="cri-o://ae40902fff531e51a279a534218854f0a632177a817f4f7e13a9bc47502de006" gracePeriod=2 Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.132180 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.132246 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.351076 4651 generic.go:334] "Generic (PLEG): container finished" podID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" containerID="ae40902fff531e51a279a534218854f0a632177a817f4f7e13a9bc47502de006" exitCode=0 Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.351235 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk2dq" 
event={"ID":"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa","Type":"ContainerDied","Data":"ae40902fff531e51a279a534218854f0a632177a817f4f7e13a9bc47502de006"} Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.517914 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.532196 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9srh\" (UniqueName: \"kubernetes.io/projected/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-kube-api-access-d9srh\") pod \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.568268 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-kube-api-access-d9srh" (OuterVolumeSpecName: "kube-api-access-d9srh") pod "9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" (UID: "9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa"). InnerVolumeSpecName "kube-api-access-d9srh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.632726 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-utilities\") pod \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.632789 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-catalog-content\") pod \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\" (UID: \"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa\") " Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.632990 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9srh\" (UniqueName: \"kubernetes.io/projected/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-kube-api-access-d9srh\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.633696 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-utilities" (OuterVolumeSpecName: "utilities") pod "9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" (UID: "9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.685481 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" (UID: "9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.734072 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:29 crc kubenswrapper[4651]: I1126 15:02:29.734097 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:30 crc kubenswrapper[4651]: I1126 15:02:30.361848 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk2dq" event={"ID":"9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa","Type":"ContainerDied","Data":"1d09d2371b6ae06ece822dfcfd6f7dc4fa3bea057e27983c7c1e52648ac0179b"} Nov 26 15:02:30 crc kubenswrapper[4651]: I1126 15:02:30.361901 4651 scope.go:117] "RemoveContainer" containerID="ae40902fff531e51a279a534218854f0a632177a817f4f7e13a9bc47502de006" Nov 26 15:02:30 crc kubenswrapper[4651]: I1126 15:02:30.361930 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vk2dq" Nov 26 15:02:30 crc kubenswrapper[4651]: I1126 15:02:30.387695 4651 scope.go:117] "RemoveContainer" containerID="429da556ee0f5fa405394d134d3bc67d8104be9f88ab6e8b9cd71391448f7196" Nov 26 15:02:30 crc kubenswrapper[4651]: I1126 15:02:30.413144 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vk2dq"] Nov 26 15:02:30 crc kubenswrapper[4651]: I1126 15:02:30.418023 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vk2dq"] Nov 26 15:02:30 crc kubenswrapper[4651]: I1126 15:02:30.430631 4651 scope.go:117] "RemoveContainer" containerID="fb78c32c23295e4e30736bdb762f180a3f0db44e1f19c064ea12bf4d5e63b1e4" Nov 26 15:02:30 crc kubenswrapper[4651]: I1126 15:02:30.675552 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-csn5m"] Nov 26 15:02:30 crc kubenswrapper[4651]: I1126 15:02:30.675824 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-csn5m" podUID="d0397299-e80d-4c6e-9634-b6f64ad039be" containerName="registry-server" containerID="cri-o://f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50" gracePeriod=2 Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.043681 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.152410 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-catalog-content\") pod \"d0397299-e80d-4c6e-9634-b6f64ad039be\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.152469 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9ts4\" (UniqueName: \"kubernetes.io/projected/d0397299-e80d-4c6e-9634-b6f64ad039be-kube-api-access-s9ts4\") pod \"d0397299-e80d-4c6e-9634-b6f64ad039be\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.152587 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-utilities\") pod \"d0397299-e80d-4c6e-9634-b6f64ad039be\" (UID: \"d0397299-e80d-4c6e-9634-b6f64ad039be\") " Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.153576 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-utilities" (OuterVolumeSpecName: "utilities") pod "d0397299-e80d-4c6e-9634-b6f64ad039be" (UID: "d0397299-e80d-4c6e-9634-b6f64ad039be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.157167 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0397299-e80d-4c6e-9634-b6f64ad039be-kube-api-access-s9ts4" (OuterVolumeSpecName: "kube-api-access-s9ts4") pod "d0397299-e80d-4c6e-9634-b6f64ad039be" (UID: "d0397299-e80d-4c6e-9634-b6f64ad039be"). InnerVolumeSpecName "kube-api-access-s9ts4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.167869 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0397299-e80d-4c6e-9634-b6f64ad039be" (UID: "d0397299-e80d-4c6e-9634-b6f64ad039be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.254268 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.254302 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0397299-e80d-4c6e-9634-b6f64ad039be-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.254316 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9ts4\" (UniqueName: \"kubernetes.io/projected/d0397299-e80d-4c6e-9634-b6f64ad039be-kube-api-access-s9ts4\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.371829 4651 generic.go:334] "Generic (PLEG): container finished" podID="d0397299-e80d-4c6e-9634-b6f64ad039be" containerID="f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50" exitCode=0 Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.371920 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-csn5m" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.372352 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csn5m" event={"ID":"d0397299-e80d-4c6e-9634-b6f64ad039be","Type":"ContainerDied","Data":"f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50"} Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.372438 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-csn5m" event={"ID":"d0397299-e80d-4c6e-9634-b6f64ad039be","Type":"ContainerDied","Data":"d5b54ccae580d46fb969f2630439f39b5bbcd26535f5216766be9efb2032c8a0"} Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.372464 4651 scope.go:117] "RemoveContainer" containerID="f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.392427 4651 scope.go:117] "RemoveContainer" containerID="a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.410107 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" path="/var/lib/kubelet/pods/9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa/volumes" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.410620 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-csn5m"] Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.410646 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-csn5m"] Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.426445 4651 scope.go:117] "RemoveContainer" containerID="71bd23ed8905dd028dd0856901b418682a8a740a93d5cad3652020d16c107f4c" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.441257 4651 scope.go:117] "RemoveContainer" 
containerID="f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50" Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.441741 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50\": container with ID starting with f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50 not found: ID does not exist" containerID="f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.441787 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50"} err="failed to get container status \"f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50\": rpc error: code = NotFound desc = could not find container \"f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50\": container with ID starting with f6ca8940578b90c57c1f3a439d8fe5d1edaca86d00604abb67b033d67ade4d50 not found: ID does not exist" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.441820 4651 scope.go:117] "RemoveContainer" containerID="a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54" Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.442133 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54\": container with ID starting with a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54 not found: ID does not exist" containerID="a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.442164 4651 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54"} err="failed to get container status \"a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54\": rpc error: code = NotFound desc = could not find container \"a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54\": container with ID starting with a3ed73ccc9f0a457da82a8e2f3b8498d63a54ac79817646dfddf1b6318f67e54 not found: ID does not exist" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.442182 4651 scope.go:117] "RemoveContainer" containerID="71bd23ed8905dd028dd0856901b418682a8a740a93d5cad3652020d16c107f4c" Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.442444 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71bd23ed8905dd028dd0856901b418682a8a740a93d5cad3652020d16c107f4c\": container with ID starting with 71bd23ed8905dd028dd0856901b418682a8a740a93d5cad3652020d16c107f4c not found: ID does not exist" containerID="71bd23ed8905dd028dd0856901b418682a8a740a93d5cad3652020d16c107f4c" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.442487 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71bd23ed8905dd028dd0856901b418682a8a740a93d5cad3652020d16c107f4c"} err="failed to get container status \"71bd23ed8905dd028dd0856901b418682a8a740a93d5cad3652020d16c107f4c\": rpc error: code = NotFound desc = could not find container \"71bd23ed8905dd028dd0856901b418682a8a740a93d5cad3652020d16c107f4c\": container with ID starting with 71bd23ed8905dd028dd0856901b418682a8a740a93d5cad3652020d16c107f4c not found: ID does not exist" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.707527 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f"] Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.708063 4651 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerName="extract-content" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708077 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerName="extract-content" Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.708089 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0397299-e80d-4c6e-9634-b6f64ad039be" containerName="registry-server" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708095 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0397299-e80d-4c6e-9634-b6f64ad039be" containerName="registry-server" Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.708103 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" containerName="registry-server" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708110 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" containerName="registry-server" Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.708121 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerName="extract-utilities" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708126 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerName="extract-utilities" Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.708133 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" containerName="extract-content" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708138 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" containerName="extract-content" Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.708154 4651 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d0397299-e80d-4c6e-9634-b6f64ad039be" containerName="extract-content" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708160 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0397299-e80d-4c6e-9634-b6f64ad039be" containerName="extract-content" Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.708167 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" containerName="extract-utilities" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708173 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" containerName="extract-utilities" Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.708181 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerName="registry-server" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708187 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerName="registry-server" Nov 26 15:02:31 crc kubenswrapper[4651]: E1126 15:02:31.708192 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0397299-e80d-4c6e-9634-b6f64ad039be" containerName="extract-utilities" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708198 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0397299-e80d-4c6e-9634-b6f64ad039be" containerName="extract-utilities" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708292 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0397299-e80d-4c6e-9634-b6f64ad039be" containerName="registry-server" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708301 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="5282ea53-ddaf-4f19-863f-d22f4cde4570" containerName="registry-server" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.708316 4651 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9b0be32e-a5a9-4d99-86a1-bf2ee3ea64aa" containerName="registry-server" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.709118 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.711149 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9zznm" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.719190 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f"] Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.759436 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-bundle\") pod \"7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.759480 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-util\") pod \"7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.759507 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgrgv\" (UniqueName: \"kubernetes.io/projected/ca662d04-4f23-48dc-b58c-d96bb9d5073c-kube-api-access-lgrgv\") pod 
\"7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.860196 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-bundle\") pod \"7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.860243 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-util\") pod \"7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.860273 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgrgv\" (UniqueName: \"kubernetes.io/projected/ca662d04-4f23-48dc-b58c-d96bb9d5073c-kube-api-access-lgrgv\") pod \"7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.860741 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-util\") pod \"7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " 
pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.860792 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-bundle\") pod \"7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:31 crc kubenswrapper[4651]: I1126 15:02:31.879230 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgrgv\" (UniqueName: \"kubernetes.io/projected/ca662d04-4f23-48dc-b58c-d96bb9d5073c-kube-api-access-lgrgv\") pod \"7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:32 crc kubenswrapper[4651]: I1126 15:02:32.025540 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:32 crc kubenswrapper[4651]: I1126 15:02:32.443684 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f"] Nov 26 15:02:33 crc kubenswrapper[4651]: I1126 15:02:33.386321 4651 generic.go:334] "Generic (PLEG): container finished" podID="ca662d04-4f23-48dc-b58c-d96bb9d5073c" containerID="a3c002b17b5c31246ec50fe29e3ca414b5fec5a46717dc31fa1569723cf0ce4b" exitCode=0 Nov 26 15:02:33 crc kubenswrapper[4651]: I1126 15:02:33.386414 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" event={"ID":"ca662d04-4f23-48dc-b58c-d96bb9d5073c","Type":"ContainerDied","Data":"a3c002b17b5c31246ec50fe29e3ca414b5fec5a46717dc31fa1569723cf0ce4b"} Nov 26 15:02:33 crc kubenswrapper[4651]: I1126 15:02:33.386648 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" event={"ID":"ca662d04-4f23-48dc-b58c-d96bb9d5073c","Type":"ContainerStarted","Data":"89930e40f4d31d69ad5e705582140fdaf73838a543fbe9344958cd2a57af7d54"} Nov 26 15:02:33 crc kubenswrapper[4651]: I1126 15:02:33.410421 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0397299-e80d-4c6e-9634-b6f64ad039be" path="/var/lib/kubelet/pods/d0397299-e80d-4c6e-9634-b6f64ad039be/volumes" Nov 26 15:02:34 crc kubenswrapper[4651]: I1126 15:02:34.393675 4651 generic.go:334] "Generic (PLEG): container finished" podID="ca662d04-4f23-48dc-b58c-d96bb9d5073c" containerID="0870c2a1422b4e9cb65b59a03064f2a306ac8a054367842a8c28931136fe8f8c" exitCode=0 Nov 26 15:02:34 crc kubenswrapper[4651]: I1126 15:02:34.393988 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" event={"ID":"ca662d04-4f23-48dc-b58c-d96bb9d5073c","Type":"ContainerDied","Data":"0870c2a1422b4e9cb65b59a03064f2a306ac8a054367842a8c28931136fe8f8c"} Nov 26 15:02:35 crc kubenswrapper[4651]: I1126 15:02:35.401140 4651 generic.go:334] "Generic (PLEG): container finished" podID="ca662d04-4f23-48dc-b58c-d96bb9d5073c" containerID="db42d31f31b5cc10a1b0a27fc594111c086173029c735692100838b0448b6d04" exitCode=0 Nov 26 15:02:35 crc kubenswrapper[4651]: I1126 15:02:35.408430 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" event={"ID":"ca662d04-4f23-48dc-b58c-d96bb9d5073c","Type":"ContainerDied","Data":"db42d31f31b5cc10a1b0a27fc594111c086173029c735692100838b0448b6d04"} Nov 26 15:02:36 crc kubenswrapper[4651]: I1126 15:02:36.725394 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:36 crc kubenswrapper[4651]: I1126 15:02:36.921806 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-bundle\") pod \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " Nov 26 15:02:36 crc kubenswrapper[4651]: I1126 15:02:36.922785 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-bundle" (OuterVolumeSpecName: "bundle") pod "ca662d04-4f23-48dc-b58c-d96bb9d5073c" (UID: "ca662d04-4f23-48dc-b58c-d96bb9d5073c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:02:36 crc kubenswrapper[4651]: I1126 15:02:36.923654 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgrgv\" (UniqueName: \"kubernetes.io/projected/ca662d04-4f23-48dc-b58c-d96bb9d5073c-kube-api-access-lgrgv\") pod \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " Nov 26 15:02:36 crc kubenswrapper[4651]: I1126 15:02:36.923732 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-util\") pod \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\" (UID: \"ca662d04-4f23-48dc-b58c-d96bb9d5073c\") " Nov 26 15:02:36 crc kubenswrapper[4651]: I1126 15:02:36.924384 4651 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:36 crc kubenswrapper[4651]: I1126 15:02:36.931310 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca662d04-4f23-48dc-b58c-d96bb9d5073c-kube-api-access-lgrgv" (OuterVolumeSpecName: "kube-api-access-lgrgv") pod "ca662d04-4f23-48dc-b58c-d96bb9d5073c" (UID: "ca662d04-4f23-48dc-b58c-d96bb9d5073c"). InnerVolumeSpecName "kube-api-access-lgrgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:02:36 crc kubenswrapper[4651]: I1126 15:02:36.937930 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-util" (OuterVolumeSpecName: "util") pod "ca662d04-4f23-48dc-b58c-d96bb9d5073c" (UID: "ca662d04-4f23-48dc-b58c-d96bb9d5073c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:02:37 crc kubenswrapper[4651]: I1126 15:02:37.025874 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgrgv\" (UniqueName: \"kubernetes.io/projected/ca662d04-4f23-48dc-b58c-d96bb9d5073c-kube-api-access-lgrgv\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:37 crc kubenswrapper[4651]: I1126 15:02:37.025914 4651 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca662d04-4f23-48dc-b58c-d96bb9d5073c-util\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:37 crc kubenswrapper[4651]: I1126 15:02:37.416409 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" event={"ID":"ca662d04-4f23-48dc-b58c-d96bb9d5073c","Type":"ContainerDied","Data":"89930e40f4d31d69ad5e705582140fdaf73838a543fbe9344958cd2a57af7d54"} Nov 26 15:02:37 crc kubenswrapper[4651]: I1126 15:02:37.416455 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89930e40f4d31d69ad5e705582140fdaf73838a543fbe9344958cd2a57af7d54" Nov 26 15:02:37 crc kubenswrapper[4651]: I1126 15:02:37.416490 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f" Nov 26 15:02:39 crc kubenswrapper[4651]: I1126 15:02:39.879221 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b4rj4"] Nov 26 15:02:39 crc kubenswrapper[4651]: E1126 15:02:39.879734 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca662d04-4f23-48dc-b58c-d96bb9d5073c" containerName="extract" Nov 26 15:02:39 crc kubenswrapper[4651]: I1126 15:02:39.879746 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca662d04-4f23-48dc-b58c-d96bb9d5073c" containerName="extract" Nov 26 15:02:39 crc kubenswrapper[4651]: E1126 15:02:39.879757 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca662d04-4f23-48dc-b58c-d96bb9d5073c" containerName="util" Nov 26 15:02:39 crc kubenswrapper[4651]: I1126 15:02:39.879789 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca662d04-4f23-48dc-b58c-d96bb9d5073c" containerName="util" Nov 26 15:02:39 crc kubenswrapper[4651]: E1126 15:02:39.879803 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca662d04-4f23-48dc-b58c-d96bb9d5073c" containerName="pull" Nov 26 15:02:39 crc kubenswrapper[4651]: I1126 15:02:39.879810 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca662d04-4f23-48dc-b58c-d96bb9d5073c" containerName="pull" Nov 26 15:02:39 crc kubenswrapper[4651]: I1126 15:02:39.879929 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca662d04-4f23-48dc-b58c-d96bb9d5073c" containerName="extract" Nov 26 15:02:39 crc kubenswrapper[4651]: I1126 15:02:39.880879 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:39 crc kubenswrapper[4651]: I1126 15:02:39.892551 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b4rj4"] Nov 26 15:02:39 crc kubenswrapper[4651]: I1126 15:02:39.960969 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-utilities\") pod \"community-operators-b4rj4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:39 crc kubenswrapper[4651]: I1126 15:02:39.961091 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wjr9\" (UniqueName: \"kubernetes.io/projected/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-kube-api-access-9wjr9\") pod \"community-operators-b4rj4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:39 crc kubenswrapper[4651]: I1126 15:02:39.961109 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-catalog-content\") pod \"community-operators-b4rj4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.062293 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wjr9\" (UniqueName: \"kubernetes.io/projected/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-kube-api-access-9wjr9\") pod \"community-operators-b4rj4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.062339 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-catalog-content\") pod \"community-operators-b4rj4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.062390 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-utilities\") pod \"community-operators-b4rj4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.062942 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-utilities\") pod \"community-operators-b4rj4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.062972 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-catalog-content\") pod \"community-operators-b4rj4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.081878 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wjr9\" (UniqueName: \"kubernetes.io/projected/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-kube-api-access-9wjr9\") pod \"community-operators-b4rj4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.240364 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.363182 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c"] Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.364028 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.368348 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px5r9\" (UniqueName: \"kubernetes.io/projected/674eb001-765e-433a-89d6-2a82fb599a93-kube-api-access-px5r9\") pod \"openstack-operator-controller-operator-6b4f979c6c-lg95c\" (UID: \"674eb001-765e-433a-89d6-2a82fb599a93\") " pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.375532 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-gdbpc" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.474315 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px5r9\" (UniqueName: \"kubernetes.io/projected/674eb001-765e-433a-89d6-2a82fb599a93-kube-api-access-px5r9\") pod \"openstack-operator-controller-operator-6b4f979c6c-lg95c\" (UID: \"674eb001-765e-433a-89d6-2a82fb599a93\") " pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.488577 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c"] Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.512650 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-px5r9\" (UniqueName: \"kubernetes.io/projected/674eb001-765e-433a-89d6-2a82fb599a93-kube-api-access-px5r9\") pod \"openstack-operator-controller-operator-6b4f979c6c-lg95c\" (UID: \"674eb001-765e-433a-89d6-2a82fb599a93\") " pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.678724 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" Nov 26 15:02:40 crc kubenswrapper[4651]: I1126 15:02:40.868809 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b4rj4"] Nov 26 15:02:41 crc kubenswrapper[4651]: I1126 15:02:41.135643 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c"] Nov 26 15:02:41 crc kubenswrapper[4651]: I1126 15:02:41.459431 4651 generic.go:334] "Generic (PLEG): container finished" podID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" containerID="b51e491b68400fa3c29e1ac0a35fb2f394d3dfac7928cc6739031cc2d1afe017" exitCode=0 Nov 26 15:02:41 crc kubenswrapper[4651]: I1126 15:02:41.460108 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4rj4" event={"ID":"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4","Type":"ContainerDied","Data":"b51e491b68400fa3c29e1ac0a35fb2f394d3dfac7928cc6739031cc2d1afe017"} Nov 26 15:02:41 crc kubenswrapper[4651]: I1126 15:02:41.460133 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4rj4" event={"ID":"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4","Type":"ContainerStarted","Data":"3286b2fe2a8bb50dd4765b76749d96b0871ff5e0de7122345698ee947a3cc9e4"} Nov 26 15:02:41 crc kubenswrapper[4651]: I1126 15:02:41.463064 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" event={"ID":"674eb001-765e-433a-89d6-2a82fb599a93","Type":"ContainerStarted","Data":"b8034c013b9b23a2def07106056b9bc5f966bc4fb8088ab6475fe8f3769f3019"} Nov 26 15:02:42 crc kubenswrapper[4651]: I1126 15:02:42.477275 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4rj4" event={"ID":"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4","Type":"ContainerStarted","Data":"80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea"} Nov 26 15:02:43 crc kubenswrapper[4651]: I1126 15:02:43.484227 4651 generic.go:334] "Generic (PLEG): container finished" podID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" containerID="80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea" exitCode=0 Nov 26 15:02:43 crc kubenswrapper[4651]: I1126 15:02:43.484510 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4rj4" event={"ID":"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4","Type":"ContainerDied","Data":"80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea"} Nov 26 15:02:46 crc kubenswrapper[4651]: I1126 15:02:46.516369 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4rj4" event={"ID":"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4","Type":"ContainerStarted","Data":"a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0"} Nov 26 15:02:46 crc kubenswrapper[4651]: I1126 15:02:46.518784 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" event={"ID":"674eb001-765e-433a-89d6-2a82fb599a93","Type":"ContainerStarted","Data":"c563184a72db972980c8e7bee4ee080ab41751f8ee8d68204bfab0c762a9a579"} Nov 26 15:02:46 crc kubenswrapper[4651]: I1126 15:02:46.519334 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" Nov 26 15:02:46 crc kubenswrapper[4651]: I1126 15:02:46.542374 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b4rj4" podStartSLOduration=3.586945656 podStartE2EDuration="7.542354217s" podCreationTimestamp="2025-11-26 15:02:39 +0000 UTC" firstStartedPulling="2025-11-26 15:02:41.461140666 +0000 UTC m=+728.886888270" lastFinishedPulling="2025-11-26 15:02:45.416549227 +0000 UTC m=+732.842296831" observedRunningTime="2025-11-26 15:02:46.539019948 +0000 UTC m=+733.964767552" watchObservedRunningTime="2025-11-26 15:02:46.542354217 +0000 UTC m=+733.968101831" Nov 26 15:02:46 crc kubenswrapper[4651]: I1126 15:02:46.582943 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" podStartSLOduration=2.305701117 podStartE2EDuration="6.582922982s" podCreationTimestamp="2025-11-26 15:02:40 +0000 UTC" firstStartedPulling="2025-11-26 15:02:41.153551562 +0000 UTC m=+728.579299166" lastFinishedPulling="2025-11-26 15:02:45.430773427 +0000 UTC m=+732.856521031" observedRunningTime="2025-11-26 15:02:46.579975192 +0000 UTC m=+734.005722836" watchObservedRunningTime="2025-11-26 15:02:46.582922982 +0000 UTC m=+734.008670596" Nov 26 15:02:50 crc kubenswrapper[4651]: I1126 15:02:50.241027 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:50 crc kubenswrapper[4651]: I1126 15:02:50.241436 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:50 crc kubenswrapper[4651]: I1126 15:02:50.291968 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:50 crc kubenswrapper[4651]: I1126 
15:02:50.577727 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:50 crc kubenswrapper[4651]: I1126 15:02:50.617156 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b4rj4"] Nov 26 15:02:50 crc kubenswrapper[4651]: I1126 15:02:50.681446 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" Nov 26 15:02:52 crc kubenswrapper[4651]: I1126 15:02:52.550956 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b4rj4" podUID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" containerName="registry-server" containerID="cri-o://a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0" gracePeriod=2 Nov 26 15:02:52 crc kubenswrapper[4651]: I1126 15:02:52.948578 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.052987 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-catalog-content\") pod \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.053344 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wjr9\" (UniqueName: \"kubernetes.io/projected/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-kube-api-access-9wjr9\") pod \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.053376 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-utilities\") pod \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\" (UID: \"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4\") " Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.054266 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-utilities" (OuterVolumeSpecName: "utilities") pod "5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" (UID: "5799d7fe-eb0a-46ba-867d-8f7082c3b3a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.062316 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-kube-api-access-9wjr9" (OuterVolumeSpecName: "kube-api-access-9wjr9") pod "5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" (UID: "5799d7fe-eb0a-46ba-867d-8f7082c3b3a4"). InnerVolumeSpecName "kube-api-access-9wjr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.154176 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.154212 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wjr9\" (UniqueName: \"kubernetes.io/projected/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-kube-api-access-9wjr9\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.558275 4651 generic.go:334] "Generic (PLEG): container finished" podID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" containerID="a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0" exitCode=0 Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.558325 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4rj4" event={"ID":"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4","Type":"ContainerDied","Data":"a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0"} Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.558359 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b4rj4" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.558374 4651 scope.go:117] "RemoveContainer" containerID="a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.558362 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4rj4" event={"ID":"5799d7fe-eb0a-46ba-867d-8f7082c3b3a4","Type":"ContainerDied","Data":"3286b2fe2a8bb50dd4765b76749d96b0871ff5e0de7122345698ee947a3cc9e4"} Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.586753 4651 scope.go:117] "RemoveContainer" containerID="80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.603326 4651 scope.go:117] "RemoveContainer" containerID="b51e491b68400fa3c29e1ac0a35fb2f394d3dfac7928cc6739031cc2d1afe017" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.633810 4651 scope.go:117] "RemoveContainer" containerID="a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0" Nov 26 15:02:53 crc kubenswrapper[4651]: E1126 15:02:53.639594 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0\": container with ID starting with a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0 not found: ID does not exist" containerID="a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.639655 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0"} err="failed to get container status \"a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0\": rpc error: code = NotFound desc = could not find container 
\"a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0\": container with ID starting with a475a28df36480feed387bafc30c84f4465d2cb1e8ab7038585146405dd0d4e0 not found: ID does not exist" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.639689 4651 scope.go:117] "RemoveContainer" containerID="80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea" Nov 26 15:02:53 crc kubenswrapper[4651]: E1126 15:02:53.640350 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea\": container with ID starting with 80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea not found: ID does not exist" containerID="80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.640432 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea"} err="failed to get container status \"80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea\": rpc error: code = NotFound desc = could not find container \"80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea\": container with ID starting with 80f723a7707212dd38bd485357d30bef0b713ec0f0781cc4fa2a8f22241ba9ea not found: ID does not exist" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.640486 4651 scope.go:117] "RemoveContainer" containerID="b51e491b68400fa3c29e1ac0a35fb2f394d3dfac7928cc6739031cc2d1afe017" Nov 26 15:02:53 crc kubenswrapper[4651]: E1126 15:02:53.641506 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b51e491b68400fa3c29e1ac0a35fb2f394d3dfac7928cc6739031cc2d1afe017\": container with ID starting with b51e491b68400fa3c29e1ac0a35fb2f394d3dfac7928cc6739031cc2d1afe017 not found: ID does not exist" 
containerID="b51e491b68400fa3c29e1ac0a35fb2f394d3dfac7928cc6739031cc2d1afe017" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.641541 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51e491b68400fa3c29e1ac0a35fb2f394d3dfac7928cc6739031cc2d1afe017"} err="failed to get container status \"b51e491b68400fa3c29e1ac0a35fb2f394d3dfac7928cc6739031cc2d1afe017\": rpc error: code = NotFound desc = could not find container \"b51e491b68400fa3c29e1ac0a35fb2f394d3dfac7928cc6739031cc2d1afe017\": container with ID starting with b51e491b68400fa3c29e1ac0a35fb2f394d3dfac7928cc6739031cc2d1afe017 not found: ID does not exist" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.642099 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" (UID: "5799d7fe-eb0a-46ba-867d-8f7082c3b3a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.677723 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.889545 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b4rj4"] Nov 26 15:02:53 crc kubenswrapper[4651]: I1126 15:02:53.896181 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b4rj4"] Nov 26 15:02:55 crc kubenswrapper[4651]: I1126 15:02:55.410062 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" path="/var/lib/kubelet/pods/5799d7fe-eb0a-46ba-867d-8f7082c3b3a4/volumes" Nov 26 15:02:59 crc kubenswrapper[4651]: I1126 15:02:59.132388 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:02:59 crc kubenswrapper[4651]: I1126 15:02:59.132707 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:02:59 crc kubenswrapper[4651]: I1126 15:02:59.132952 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 15:02:59 crc kubenswrapper[4651]: I1126 15:02:59.133505 4651 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9df9330edcd7367fada547dd9b0bad3227c48b21a556e1698b8293c8ff9fe4a"} pod="openshift-machine-config-operator/machine-config-daemon-99mrs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:02:59 crc kubenswrapper[4651]: I1126 15:02:59.133566 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" containerID="cri-o://c9df9330edcd7367fada547dd9b0bad3227c48b21a556e1698b8293c8ff9fe4a" gracePeriod=600 Nov 26 15:02:59 crc kubenswrapper[4651]: I1126 15:02:59.593145 4651 generic.go:334] "Generic (PLEG): container finished" podID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerID="c9df9330edcd7367fada547dd9b0bad3227c48b21a556e1698b8293c8ff9fe4a" exitCode=0 Nov 26 15:02:59 crc kubenswrapper[4651]: I1126 15:02:59.593217 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerDied","Data":"c9df9330edcd7367fada547dd9b0bad3227c48b21a556e1698b8293c8ff9fe4a"} Nov 26 15:02:59 crc kubenswrapper[4651]: I1126 15:02:59.593499 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"1bed2bd078ae425b6996e470a55f2b4cd2080217fee4c7bfa79d544ccd51cf36"} Nov 26 15:02:59 crc kubenswrapper[4651]: I1126 15:02:59.593522 4651 scope.go:117] "RemoveContainer" containerID="77c8189b80a1a06a684db450cc919068d52888695cc9756916189ce184f0c190" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.888690 4651 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9"] Nov 26 15:03:06 crc kubenswrapper[4651]: E1126 15:03:06.889557 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" containerName="extract-content" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.889575 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" containerName="extract-content" Nov 26 15:03:06 crc kubenswrapper[4651]: E1126 15:03:06.889599 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" containerName="extract-utilities" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.889607 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" containerName="extract-utilities" Nov 26 15:03:06 crc kubenswrapper[4651]: E1126 15:03:06.889629 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" containerName="registry-server" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.889639 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" containerName="registry-server" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.889787 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="5799d7fe-eb0a-46ba-867d-8f7082c3b3a4" containerName="registry-server" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.890552 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.894123 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x"] Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.894997 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.895712 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4tjkf" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.896743 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-rvhpg" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.920154 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9"] Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.931924 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-q8cjf"] Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.932988 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.938843 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-r9nfj" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.947341 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p"] Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.953388 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" Nov 26 15:03:06 crc kubenswrapper[4651]: I1126 15:03:06.958628 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-jwdhh" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.007855 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znmt6\" (UniqueName: \"kubernetes.io/projected/85fb4e98-47db-403d-85e3-c2550cd47160-kube-api-access-znmt6\") pod \"cinder-operator-controller-manager-6b7f75547b-k4tq9\" (UID: \"85fb4e98-47db-403d-85e3-c2550cd47160\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.007906 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmsj8\" (UniqueName: \"kubernetes.io/projected/ec10af15-dcf5-413d-87ef-0ca5a469b5fa-kube-api-access-nmsj8\") pod \"barbican-operator-controller-manager-7b64f4fb85-5jb5x\" (UID: \"ec10af15-dcf5-413d-87ef-0ca5a469b5fa\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.007943 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4rmg\" (UniqueName: \"kubernetes.io/projected/6a660fe2-a185-4e56-98cb-b12cdd749964-kube-api-access-m4rmg\") pod \"glance-operator-controller-manager-589cbd6b5b-gqj7p\" (UID: \"6a660fe2-a185-4e56-98cb-b12cdd749964\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.007976 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qw2g\" (UniqueName: \"kubernetes.io/projected/5f58ef49-d516-48e5-a508-e4102374d111-kube-api-access-5qw2g\") pod \"designate-operator-controller-manager-955677c94-q8cjf\" (UID: \"5f58ef49-d516-48e5-a508-e4102374d111\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.024126 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.047116 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.065064 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.066298 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.071641 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-hsft9" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.078659 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-q8cjf"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.087524 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.088776 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.093605 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-shslt"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.094945 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.099722 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-n9s69" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.100007 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-lb8mv" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.100174 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.108352 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.108847 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qw2g\" (UniqueName: \"kubernetes.io/projected/5f58ef49-d516-48e5-a508-e4102374d111-kube-api-access-5qw2g\") pod \"designate-operator-controller-manager-955677c94-q8cjf\" (UID: \"5f58ef49-d516-48e5-a508-e4102374d111\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.108942 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znmt6\" (UniqueName: \"kubernetes.io/projected/85fb4e98-47db-403d-85e3-c2550cd47160-kube-api-access-znmt6\") pod \"cinder-operator-controller-manager-6b7f75547b-k4tq9\" (UID: \"85fb4e98-47db-403d-85e3-c2550cd47160\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.108988 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmsj8\" (UniqueName: 
\"kubernetes.io/projected/ec10af15-dcf5-413d-87ef-0ca5a469b5fa-kube-api-access-nmsj8\") pod \"barbican-operator-controller-manager-7b64f4fb85-5jb5x\" (UID: \"ec10af15-dcf5-413d-87ef-0ca5a469b5fa\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.109068 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4rmg\" (UniqueName: \"kubernetes.io/projected/6a660fe2-a185-4e56-98cb-b12cdd749964-kube-api-access-m4rmg\") pod \"glance-operator-controller-manager-589cbd6b5b-gqj7p\" (UID: \"6a660fe2-a185-4e56-98cb-b12cdd749964\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.113596 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.118118 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-shslt"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.124231 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.125272 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.129388 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-msrsw" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.137661 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.138564 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.158341 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.169115 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-jdjct" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.173220 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.197896 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.198824 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmsj8\" (UniqueName: \"kubernetes.io/projected/ec10af15-dcf5-413d-87ef-0ca5a469b5fa-kube-api-access-nmsj8\") pod \"barbican-operator-controller-manager-7b64f4fb85-5jb5x\" (UID: \"ec10af15-dcf5-413d-87ef-0ca5a469b5fa\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" Nov 26 15:03:07 crc 
kubenswrapper[4651]: I1126 15:03:07.198893 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.199320 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4rmg\" (UniqueName: \"kubernetes.io/projected/6a660fe2-a185-4e56-98cb-b12cdd749964-kube-api-access-m4rmg\") pod \"glance-operator-controller-manager-589cbd6b5b-gqj7p\" (UID: \"6a660fe2-a185-4e56-98cb-b12cdd749964\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.199519 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znmt6\" (UniqueName: \"kubernetes.io/projected/85fb4e98-47db-403d-85e3-c2550cd47160-kube-api-access-znmt6\") pod \"cinder-operator-controller-manager-6b7f75547b-k4tq9\" (UID: \"85fb4e98-47db-403d-85e3-c2550cd47160\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.200071 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qw2g\" (UniqueName: \"kubernetes.io/projected/5f58ef49-d516-48e5-a508-e4102374d111-kube-api-access-5qw2g\") pod \"designate-operator-controller-manager-955677c94-q8cjf\" (UID: \"5f58ef49-d516-48e5-a508-e4102374d111\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.202783 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.203765 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.206842 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6r9kk" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.208715 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-db2f4" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.211441 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhwm4\" (UniqueName: \"kubernetes.io/projected/dc5a51cf-b992-4542-8b00-2948ab513eed-kube-api-access-jhwm4\") pod \"keystone-operator-controller-manager-7b4567c7cf-hmndm\" (UID: \"dc5a51cf-b992-4542-8b00-2948ab513eed\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.211504 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert\") pod \"infra-operator-controller-manager-57548d458d-shslt\" (UID: \"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.211526 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtg5d\" (UniqueName: \"kubernetes.io/projected/eed373f0-add9-4ae8-b5cc-ed711e79b5c5-kube-api-access-jtg5d\") pod \"horizon-operator-controller-manager-5d494799bf-v89cv\" (UID: \"eed373f0-add9-4ae8-b5cc-ed711e79b5c5\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.211548 4651 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxcb8\" (UniqueName: \"kubernetes.io/projected/e5c0812c-3183-4f45-b6b9-d4975f8bb80a-kube-api-access-cxcb8\") pod \"heat-operator-controller-manager-5b77f656f-pt9q8\" (UID: \"e5c0812c-3183-4f45-b6b9-d4975f8bb80a\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.211572 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhldx\" (UniqueName: \"kubernetes.io/projected/14110a58-3dd5-4827-8a86-d4c0fc377b97-kube-api-access-hhldx\") pod \"ironic-operator-controller-manager-67cb4dc6d4-cggjs\" (UID: \"14110a58-3dd5-4827-8a86-d4c0fc377b97\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.211603 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwrc\" (UniqueName: \"kubernetes.io/projected/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-kube-api-access-nxwrc\") pod \"infra-operator-controller-manager-57548d458d-shslt\" (UID: \"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.211713 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.220620 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.226348 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.261654 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.271116 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.272492 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.292265 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-sk4zf" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.296869 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.303005 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.314267 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwrc\" (UniqueName: \"kubernetes.io/projected/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-kube-api-access-nxwrc\") pod \"infra-operator-controller-manager-57548d458d-shslt\" (UID: \"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.314368 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhwm4\" (UniqueName: \"kubernetes.io/projected/dc5a51cf-b992-4542-8b00-2948ab513eed-kube-api-access-jhwm4\") pod \"keystone-operator-controller-manager-7b4567c7cf-hmndm\" (UID: \"dc5a51cf-b992-4542-8b00-2948ab513eed\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.314431 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kczl\" (UniqueName: \"kubernetes.io/projected/53400076-0e4e-4e0b-b476-d4a1fd901631-kube-api-access-8kczl\") pod \"manila-operator-controller-manager-5d499bf58b-tszf4\" (UID: \"53400076-0e4e-4e0b-b476-d4a1fd901631\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.314472 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert\") pod \"infra-operator-controller-manager-57548d458d-shslt\" (UID: \"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.314499 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtg5d\" (UniqueName: \"kubernetes.io/projected/eed373f0-add9-4ae8-b5cc-ed711e79b5c5-kube-api-access-jtg5d\") pod \"horizon-operator-controller-manager-5d494799bf-v89cv\" (UID: \"eed373f0-add9-4ae8-b5cc-ed711e79b5c5\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.314531 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxcb8\" (UniqueName: \"kubernetes.io/projected/e5c0812c-3183-4f45-b6b9-d4975f8bb80a-kube-api-access-cxcb8\") pod \"heat-operator-controller-manager-5b77f656f-pt9q8\" (UID: \"e5c0812c-3183-4f45-b6b9-d4975f8bb80a\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.314561 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhldx\" (UniqueName: \"kubernetes.io/projected/14110a58-3dd5-4827-8a86-d4c0fc377b97-kube-api-access-hhldx\") pod \"ironic-operator-controller-manager-67cb4dc6d4-cggjs\" (UID: \"14110a58-3dd5-4827-8a86-d4c0fc377b97\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.314607 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmk9b\" (UniqueName: \"kubernetes.io/projected/8a55643f-68a5-47ea-8b27-db437d3af215-kube-api-access-pmk9b\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-ffbs5\" (UID: \"8a55643f-68a5-47ea-8b27-db437d3af215\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" Nov 26 15:03:07 crc kubenswrapper[4651]: E1126 15:03:07.315229 4651 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found 
Nov 26 15:03:07 crc kubenswrapper[4651]: E1126 15:03:07.315289 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert podName:99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:07.81527066 +0000 UTC m=+755.241018264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert") pod "infra-operator-controller-manager-57548d458d-shslt" (UID: "99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6") : secret "infra-operator-webhook-server-cert" not found Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.324873 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-gk4z8" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.325188 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.333172 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.337961 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwrc\" (UniqueName: \"kubernetes.io/projected/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-kube-api-access-nxwrc\") pod \"infra-operator-controller-manager-57548d458d-shslt\" (UID: \"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.353824 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.364080 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtg5d\" (UniqueName: \"kubernetes.io/projected/eed373f0-add9-4ae8-b5cc-ed711e79b5c5-kube-api-access-jtg5d\") pod \"horizon-operator-controller-manager-5d494799bf-v89cv\" (UID: \"eed373f0-add9-4ae8-b5cc-ed711e79b5c5\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.381537 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhwm4\" (UniqueName: \"kubernetes.io/projected/dc5a51cf-b992-4542-8b00-2948ab513eed-kube-api-access-jhwm4\") pod \"keystone-operator-controller-manager-7b4567c7cf-hmndm\" (UID: \"dc5a51cf-b992-4542-8b00-2948ab513eed\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.396289 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhldx\" (UniqueName: \"kubernetes.io/projected/14110a58-3dd5-4827-8a86-d4c0fc377b97-kube-api-access-hhldx\") pod \"ironic-operator-controller-manager-67cb4dc6d4-cggjs\" (UID: 
\"14110a58-3dd5-4827-8a86-d4c0fc377b97\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.417239 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q898b\" (UniqueName: \"kubernetes.io/projected/e9981be4-751d-4c74-894a-698adad4c50f-kube-api-access-q898b\") pod \"nova-operator-controller-manager-79556f57fc-cnwcz\" (UID: \"e9981be4-751d-4c74-894a-698adad4c50f\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.417289 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmk9b\" (UniqueName: \"kubernetes.io/projected/8a55643f-68a5-47ea-8b27-db437d3af215-kube-api-access-pmk9b\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-ffbs5\" (UID: \"8a55643f-68a5-47ea-8b27-db437d3af215\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.417327 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2kwc\" (UniqueName: \"kubernetes.io/projected/8271ec0d-f8ea-4c46-984f-95572691a379-kube-api-access-d2kwc\") pod \"neutron-operator-controller-manager-6fdcddb789-8h624\" (UID: \"8271ec0d-f8ea-4c46-984f-95572691a379\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.417378 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kczl\" (UniqueName: \"kubernetes.io/projected/53400076-0e4e-4e0b-b476-d4a1fd901631-kube-api-access-8kczl\") pod \"manila-operator-controller-manager-5d499bf58b-tszf4\" (UID: \"53400076-0e4e-4e0b-b476-d4a1fd901631\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" Nov 
26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.429414 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.439358 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.462721 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kczl\" (UniqueName: \"kubernetes.io/projected/53400076-0e4e-4e0b-b476-d4a1fd901631-kube-api-access-8kczl\") pod \"manila-operator-controller-manager-5d499bf58b-tszf4\" (UID: \"53400076-0e4e-4e0b-b476-d4a1fd901631\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.463094 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.484442 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.492135 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmk9b\" (UniqueName: \"kubernetes.io/projected/8a55643f-68a5-47ea-8b27-db437d3af215-kube-api-access-pmk9b\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-ffbs5\" (UID: \"8a55643f-68a5-47ea-8b27-db437d3af215\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.499554 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxcb8\" (UniqueName: \"kubernetes.io/projected/e5c0812c-3183-4f45-b6b9-d4975f8bb80a-kube-api-access-cxcb8\") pod \"heat-operator-controller-manager-5b77f656f-pt9q8\" (UID: \"e5c0812c-3183-4f45-b6b9-d4975f8bb80a\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.522339 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.523201 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.524366 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q898b\" (UniqueName: \"kubernetes.io/projected/e9981be4-751d-4c74-894a-698adad4c50f-kube-api-access-q898b\") pod \"nova-operator-controller-manager-79556f57fc-cnwcz\" (UID: \"e9981be4-751d-4c74-894a-698adad4c50f\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.524416 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2kwc\" (UniqueName: \"kubernetes.io/projected/8271ec0d-f8ea-4c46-984f-95572691a379-kube-api-access-d2kwc\") pod \"neutron-operator-controller-manager-6fdcddb789-8h624\" (UID: \"8271ec0d-f8ea-4c46-984f-95572691a379\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.527736 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-89fgx" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.532016 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.554104 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.564727 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q898b\" (UniqueName: \"kubernetes.io/projected/e9981be4-751d-4c74-894a-698adad4c50f-kube-api-access-q898b\") pod \"nova-operator-controller-manager-79556f57fc-cnwcz\" (UID: \"e9981be4-751d-4c74-894a-698adad4c50f\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.568542 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2kwc\" (UniqueName: \"kubernetes.io/projected/8271ec0d-f8ea-4c46-984f-95572691a379-kube-api-access-d2kwc\") pod \"neutron-operator-controller-manager-6fdcddb789-8h624\" (UID: \"8271ec0d-f8ea-4c46-984f-95572691a379\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.572945 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.575749 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.579988 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-nwqlt" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.638215 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.659980 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.684694 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.692595 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.693591 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.708315 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.736194 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-hcd7m" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.743360 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.744953 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h98d\" (UniqueName: \"kubernetes.io/projected/ce4c06a7-4bcb-4167-bec1-14a45ca24bea-kube-api-access-2h98d\") pod \"ovn-operator-controller-manager-56897c768d-k2rdd\" (UID: \"ce4c06a7-4bcb-4167-bec1-14a45ca24bea\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.744979 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dc8j\" (UniqueName: \"kubernetes.io/projected/b24122be-246e-4dc9-a3ad-4ca2392a4660-kube-api-access-2dc8j\") pod \"octavia-operator-controller-manager-64cdc6ff96-x9mdd\" (UID: \"b24122be-246e-4dc9-a3ad-4ca2392a4660\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.809087 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.810667 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.823831 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zdppk" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.823996 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.850069 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.852340 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.853898 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h98d\" (UniqueName: \"kubernetes.io/projected/ce4c06a7-4bcb-4167-bec1-14a45ca24bea-kube-api-access-2h98d\") pod \"ovn-operator-controller-manager-56897c768d-k2rdd\" (UID: \"ce4c06a7-4bcb-4167-bec1-14a45ca24bea\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.853934 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dc8j\" (UniqueName: \"kubernetes.io/projected/b24122be-246e-4dc9-a3ad-4ca2392a4660-kube-api-access-2dc8j\") pod \"octavia-operator-controller-manager-64cdc6ff96-x9mdd\" (UID: \"b24122be-246e-4dc9-a3ad-4ca2392a4660\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.853994 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zkrc\" (UniqueName: \"kubernetes.io/projected/a8e49781-2e0b-476d-be9f-e17f05639447-kube-api-access-5zkrc\") pod \"placement-operator-controller-manager-57988cc5b5-269d2\" (UID: \"a8e49781-2e0b-476d-be9f-e17f05639447\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.854053 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert\") pod \"infra-operator-controller-manager-57548d458d-shslt\" (UID: \"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:07 crc kubenswrapper[4651]: E1126 15:03:07.854215 4651 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 15:03:07 crc kubenswrapper[4651]: E1126 15:03:07.854264 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert podName:99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:08.8542461 +0000 UTC m=+756.279993704 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert") pod "infra-operator-controller-manager-57548d458d-shslt" (UID: "99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6") : secret "infra-operator-webhook-server-cert" not found Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.858135 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.865278 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-2d62w" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.874211 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.885427 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.904732 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dc8j\" (UniqueName: \"kubernetes.io/projected/b24122be-246e-4dc9-a3ad-4ca2392a4660-kube-api-access-2dc8j\") pod \"octavia-operator-controller-manager-64cdc6ff96-x9mdd\" (UID: \"b24122be-246e-4dc9-a3ad-4ca2392a4660\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.920620 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h98d\" (UniqueName: \"kubernetes.io/projected/ce4c06a7-4bcb-4167-bec1-14a45ca24bea-kube-api-access-2h98d\") pod \"ovn-operator-controller-manager-56897c768d-k2rdd\" (UID: \"ce4c06a7-4bcb-4167-bec1-14a45ca24bea\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" Nov 
26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.947527 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.952530 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.953604 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.956262 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.956302 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p47ts\" (UniqueName: \"kubernetes.io/projected/719afb5d-40c4-4fa3-b030-38c170fc7dbb-kube-api-access-p47ts\") pod \"swift-operator-controller-manager-d77b94747-6kjgs\" (UID: \"719afb5d-40c4-4fa3-b030-38c170fc7dbb\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.956335 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zkrc\" (UniqueName: \"kubernetes.io/projected/a8e49781-2e0b-476d-be9f-e17f05639447-kube-api-access-5zkrc\") pod \"placement-operator-controller-manager-57988cc5b5-269d2\" (UID: \"a8e49781-2e0b-476d-be9f-e17f05639447\") " 
pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.956407 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ffz9\" (UniqueName: \"kubernetes.io/projected/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-kube-api-access-2ffz9\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.962921 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-gnqpg" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.987507 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zkrc\" (UniqueName: \"kubernetes.io/projected/a8e49781-2e0b-476d-be9f-e17f05639447-kube-api-access-5zkrc\") pod \"placement-operator-controller-manager-57988cc5b5-269d2\" (UID: \"a8e49781-2e0b-476d-be9f-e17f05639447\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.989081 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb"] Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.990152 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.992368 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-t7x8w" Nov 26 15:03:07 crc kubenswrapper[4651]: I1126 15:03:07.996793 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.022119 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.023495 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.026126 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-xtz7s" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.031427 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.038789 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.058438 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ffz9\" (UniqueName: \"kubernetes.io/projected/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-kube-api-access-2ffz9\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.058534 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jjds\" (UniqueName: \"kubernetes.io/projected/8cd427a2-9759-460e-b86e-23e08dd7ba78-kube-api-access-7jjds\") pod \"test-operator-controller-manager-5cd6c7f4c8-l2z9w\" (UID: \"8cd427a2-9759-460e-b86e-23e08dd7ba78\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.058581 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.058646 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p47ts\" (UniqueName: \"kubernetes.io/projected/719afb5d-40c4-4fa3-b030-38c170fc7dbb-kube-api-access-p47ts\") pod \"swift-operator-controller-manager-d77b94747-6kjgs\" (UID: \"719afb5d-40c4-4fa3-b030-38c170fc7dbb\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.058725 4651 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.058754 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9wbf\" (UniqueName: \"kubernetes.io/projected/66532d04-3411-4813-ae53-4d635ee98911-kube-api-access-w9wbf\") pod 
\"telemetry-operator-controller-manager-76cc84c6bb-8zvlb\" (UID: \"66532d04-3411-4813-ae53-4d635ee98911\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.058809 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert podName:6b7bc81d-5bbe-4c1b-a512-93e75a1f7035 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:08.558792219 +0000 UTC m=+755.984539823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" (UID: "6b7bc81d-5bbe-4c1b-a512-93e75a1f7035") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.060076 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.063510 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.068232 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.069170 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.069382 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-gsfrd" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.069460 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.076895 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.077715 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.077849 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.088846 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.090256 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sdtmc" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.096322 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p47ts\" (UniqueName: \"kubernetes.io/projected/719afb5d-40c4-4fa3-b030-38c170fc7dbb-kube-api-access-p47ts\") pod \"swift-operator-controller-manager-d77b94747-6kjgs\" (UID: \"719afb5d-40c4-4fa3-b030-38c170fc7dbb\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.097091 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ffz9\" (UniqueName: \"kubernetes.io/projected/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-kube-api-access-2ffz9\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.164021 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.164153 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrz2q\" (UniqueName: \"kubernetes.io/projected/a72e6d14-1571-4b70-b872-a4a4b0b3c242-kube-api-access-qrz2q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wwjsd\" (UID: \"a72e6d14-1571-4b70-b872-a4a4b0b3c242\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.164216 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jjds\" (UniqueName: \"kubernetes.io/projected/8cd427a2-9759-460e-b86e-23e08dd7ba78-kube-api-access-7jjds\") pod \"test-operator-controller-manager-5cd6c7f4c8-l2z9w\" (UID: \"8cd427a2-9759-460e-b86e-23e08dd7ba78\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.164258 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bch62\" (UniqueName: \"kubernetes.io/projected/e8ad6eac-027c-4615-a5dd-6facdc1db056-kube-api-access-bch62\") pod \"watcher-operator-controller-manager-656dcb59d4-s5dd9\" (UID: \"e8ad6eac-027c-4615-a5dd-6facdc1db056\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.164355 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.164422 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89wjh\" (UniqueName: 
\"kubernetes.io/projected/e50a607f-7a61-4a78-870a-297fa0daa977-kube-api-access-89wjh\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.164481 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9wbf\" (UniqueName: \"kubernetes.io/projected/66532d04-3411-4813-ae53-4d635ee98911-kube-api-access-w9wbf\") pod \"telemetry-operator-controller-manager-76cc84c6bb-8zvlb\" (UID: \"66532d04-3411-4813-ae53-4d635ee98911\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.179919 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.180543 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.184680 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jjds\" (UniqueName: \"kubernetes.io/projected/8cd427a2-9759-460e-b86e-23e08dd7ba78-kube-api-access-7jjds\") pod \"test-operator-controller-manager-5cd6c7f4c8-l2z9w\" (UID: \"8cd427a2-9759-460e-b86e-23e08dd7ba78\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.190193 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9wbf\" (UniqueName: \"kubernetes.io/projected/66532d04-3411-4813-ae53-4d635ee98911-kube-api-access-w9wbf\") pod \"telemetry-operator-controller-manager-76cc84c6bb-8zvlb\" (UID: \"66532d04-3411-4813-ae53-4d635ee98911\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.249057 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.268202 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.268555 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrz2q\" (UniqueName: \"kubernetes.io/projected/a72e6d14-1571-4b70-b872-a4a4b0b3c242-kube-api-access-qrz2q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wwjsd\" (UID: 
\"a72e6d14-1571-4b70-b872-a4a4b0b3c242\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.272254 4651 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.272334 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs podName:e50a607f-7a61-4a78-870a-297fa0daa977 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:08.772315567 +0000 UTC m=+756.198063171 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs") pod "openstack-operator-controller-manager-5bcdd9fbc-vsb4g" (UID: "e50a607f-7a61-4a78-870a-297fa0daa977") : secret "webhook-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.275639 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bch62\" (UniqueName: \"kubernetes.io/projected/e8ad6eac-027c-4615-a5dd-6facdc1db056-kube-api-access-bch62\") pod \"watcher-operator-controller-manager-656dcb59d4-s5dd9\" (UID: \"e8ad6eac-027c-4615-a5dd-6facdc1db056\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.275760 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.275819 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-89wjh\" (UniqueName: \"kubernetes.io/projected/e50a607f-7a61-4a78-870a-297fa0daa977-kube-api-access-89wjh\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.276462 4651 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.276503 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs podName:e50a607f-7a61-4a78-870a-297fa0daa977 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:08.776488138 +0000 UTC m=+756.202235742 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs") pod "openstack-operator-controller-manager-5bcdd9fbc-vsb4g" (UID: "e50a607f-7a61-4a78-870a-297fa0daa977") : secret "metrics-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.290793 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrz2q\" (UniqueName: \"kubernetes.io/projected/a72e6d14-1571-4b70-b872-a4a4b0b3c242-kube-api-access-qrz2q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wwjsd\" (UID: \"a72e6d14-1571-4b70-b872-a4a4b0b3c242\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.310903 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89wjh\" (UniqueName: \"kubernetes.io/projected/e50a607f-7a61-4a78-870a-297fa0daa977-kube-api-access-89wjh\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" 
(UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.314755 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bch62\" (UniqueName: \"kubernetes.io/projected/e8ad6eac-027c-4615-a5dd-6facdc1db056-kube-api-access-bch62\") pod \"watcher-operator-controller-manager-656dcb59d4-s5dd9\" (UID: \"e8ad6eac-027c-4615-a5dd-6facdc1db056\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" Nov 26 15:03:08 crc kubenswrapper[4651]: W1126 15:03:08.344215 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec10af15_dcf5_413d_87ef_0ca5a469b5fa.slice/crio-16189edd6517631e11d74fe7024c82d72736b426a0911897ed839edd5725c9cb WatchSource:0}: Error finding container 16189edd6517631e11d74fe7024c82d72736b426a0911897ed839edd5725c9cb: Status 404 returned error can't find the container with id 16189edd6517631e11d74fe7024c82d72736b426a0911897ed839edd5725c9cb Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.349146 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.364546 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.401866 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.425303 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.517058 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.528815 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.537810 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-q8cjf"] Nov 26 15:03:08 crc kubenswrapper[4651]: W1126 15:03:08.564772 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f58ef49_d516_48e5_a508_e4102374d111.slice/crio-58b8f7741e4d165a03c515440f32d8194705bcf5e0f8bba7e68a1d7c43e3f4b5 WatchSource:0}: Error finding container 58b8f7741e4d165a03c515440f32d8194705bcf5e0f8bba7e68a1d7c43e3f4b5: Status 404 returned error can't find the container with id 58b8f7741e4d165a03c515440f32d8194705bcf5e0f8bba7e68a1d7c43e3f4b5 Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.585378 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.585564 4651 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.585623 4651 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert podName:6b7bc81d-5bbe-4c1b-a512-93e75a1f7035 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:09.585602833 +0000 UTC m=+757.011350437 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" (UID: "6b7bc81d-5bbe-4c1b-a512-93e75a1f7035") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.706832 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" event={"ID":"85fb4e98-47db-403d-85e3-c2550cd47160","Type":"ContainerStarted","Data":"c5b70c421fc2a07c208ec10b535088b5aa6d8e18b5506e24719c5566f2c9b4cf"} Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.712141 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" event={"ID":"ec10af15-dcf5-413d-87ef-0ca5a469b5fa","Type":"ContainerStarted","Data":"16189edd6517631e11d74fe7024c82d72736b426a0911897ed839edd5725c9cb"} Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.713124 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" event={"ID":"6a660fe2-a185-4e56-98cb-b12cdd749964","Type":"ContainerStarted","Data":"e6a7f1e516a17dc0d3d2a15fad97912ab9fdb99de484a73686792aeeb71f14de"} Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.714211 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" 
event={"ID":"5f58ef49-d516-48e5-a508-e4102374d111","Type":"ContainerStarted","Data":"58b8f7741e4d165a03c515440f32d8194705bcf5e0f8bba7e68a1d7c43e3f4b5"} Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.747303 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm"] Nov 26 15:03:08 crc kubenswrapper[4651]: W1126 15:03:08.754961 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc5a51cf_b992_4542_8b00_2948ab513eed.slice/crio-abcc5e0894b52477f9e4c1eac0d9432c78ee12ac0217c1ca6cc2e362b85cd347 WatchSource:0}: Error finding container abcc5e0894b52477f9e4c1eac0d9432c78ee12ac0217c1ca6cc2e362b85cd347: Status 404 returned error can't find the container with id abcc5e0894b52477f9e4c1eac0d9432c78ee12ac0217c1ca6cc2e362b85cd347 Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.762114 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv"] Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.788700 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.788816 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:08 crc 
kubenswrapper[4651]: E1126 15:03:08.788893 4651 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.788966 4651 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.788984 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs podName:e50a607f-7a61-4a78-870a-297fa0daa977 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:09.78896363 +0000 UTC m=+757.214711234 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs") pod "openstack-operator-controller-manager-5bcdd9fbc-vsb4g" (UID: "e50a607f-7a61-4a78-870a-297fa0daa977") : secret "metrics-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.789005 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs podName:e50a607f-7a61-4a78-870a-297fa0daa977 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:09.788993961 +0000 UTC m=+757.214741565 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs") pod "openstack-operator-controller-manager-5bcdd9fbc-vsb4g" (UID: "e50a607f-7a61-4a78-870a-297fa0daa977") : secret "webhook-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: I1126 15:03:08.890329 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert\") pod \"infra-operator-controller-manager-57548d458d-shslt\" (UID: \"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.890505 4651 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 15:03:08 crc kubenswrapper[4651]: E1126 15:03:08.890559 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert podName:99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:10.890542546 +0000 UTC m=+758.316290160 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert") pod "infra-operator-controller-manager-57548d458d-shslt" (UID: "99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6") : secret "infra-operator-webhook-server-cert" not found Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.032080 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4"] Nov 26 15:03:09 crc kubenswrapper[4651]: W1126 15:03:09.043147 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14110a58_3dd5_4827_8a86_d4c0fc377b97.slice/crio-df44cfdbbfc58d50167489b3dbdba32f12200b5d1d4aaa5e73b97262d6e78aea WatchSource:0}: Error finding container df44cfdbbfc58d50167489b3dbdba32f12200b5d1d4aaa5e73b97262d6e78aea: Status 404 returned error can't find the container with id df44cfdbbfc58d50167489b3dbdba32f12200b5d1d4aaa5e73b97262d6e78aea Nov 26 15:03:09 crc kubenswrapper[4651]: W1126 15:03:09.046379 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53400076_0e4e_4e0b_b476_d4a1fd901631.slice/crio-3459b86aef8b02ae6d1126f8c06ced422d879081f1cc1f349f7aa3f2ddbe11c5 WatchSource:0}: Error finding container 3459b86aef8b02ae6d1126f8c06ced422d879081f1cc1f349f7aa3f2ddbe11c5: Status 404 returned error can't find the container with id 3459b86aef8b02ae6d1126f8c06ced422d879081f1cc1f349f7aa3f2ddbe11c5 Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.048807 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs"] Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.171740 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz"] Nov 26 15:03:09 crc kubenswrapper[4651]: 
I1126 15:03:09.250503 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd"] Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.262376 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8"] Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.288304 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2"] Nov 26 15:03:09 crc kubenswrapper[4651]: W1126 15:03:09.313132 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce4c06a7_4bcb_4167_bec1_14a45ca24bea.slice/crio-b84edc5b60b2d805c2ce3e1c0e444446ba1e6e24c4441bd68f071168d061fc1f WatchSource:0}: Error finding container b84edc5b60b2d805c2ce3e1c0e444446ba1e6e24c4441bd68f071168d061fc1f: Status 404 returned error can't find the container with id b84edc5b60b2d805c2ce3e1c0e444446ba1e6e24c4441bd68f071168d061fc1f Nov 26 15:03:09 crc kubenswrapper[4651]: W1126 15:03:09.337505 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8e49781_2e0b_476d_be9f_e17f05639447.slice/crio-c21bc648ec967a9fe3411c72fb6763116f38628c7d6c21f6f0b616f69f455fa5 WatchSource:0}: Error finding container c21bc648ec967a9fe3411c72fb6763116f38628c7d6c21f6f0b616f69f455fa5: Status 404 returned error can't find the container with id c21bc648ec967a9fe3411c72fb6763116f38628c7d6c21f6f0b616f69f455fa5 Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.341223 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5"] Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.346569 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624"] Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.382109 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb"] Nov 26 15:03:09 crc kubenswrapper[4651]: W1126 15:03:09.386517 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66532d04_3411_4813_ae53_4d635ee98911.slice/crio-1ff058938085a1b11a24b07a8660c6fa1cc93ead8a7593a86b042fc952e4c98b WatchSource:0}: Error finding container 1ff058938085a1b11a24b07a8660c6fa1cc93ead8a7593a86b042fc952e4c98b: Status 404 returned error can't find the container with id 1ff058938085a1b11a24b07a8660c6fa1cc93ead8a7593a86b042fc952e4c98b Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.452976 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd"] Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.453012 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9"] Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.462354 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bch62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-656dcb59d4-s5dd9_openstack-operators(e8ad6eac-027c-4615-a5dd-6facdc1db056): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.466656 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bch62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-656dcb59d4-s5dd9_openstack-operators(e8ad6eac-027c-4615-a5dd-6facdc1db056): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.467742 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" 
podUID="e8ad6eac-027c-4615-a5dd-6facdc1db056" Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.476417 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jjds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd6c7f4c8-l2z9w_openstack-operators(8cd427a2-9759-460e-b86e-23e08dd7ba78): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.481191 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jjds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd6c7f4c8-l2z9w_openstack-operators(8cd427a2-9759-460e-b86e-23e08dd7ba78): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.483651 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" podUID="8cd427a2-9759-460e-b86e-23e08dd7ba78" Nov 26 15:03:09 crc kubenswrapper[4651]: W1126 15:03:09.490871 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb24122be_246e_4dc9_a3ad_4ca2392a4660.slice/crio-3880445631f218e55d9786db87e428d436b889964c0e4641e8f56e848e4f63ee WatchSource:0}: Error finding container 3880445631f218e55d9786db87e428d436b889964c0e4641e8f56e848e4f63ee: Status 404 returned error can't find the container with id 
3880445631f218e55d9786db87e428d436b889964c0e4641e8f56e848e4f63ee Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.492777 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs"] Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.496611 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2dc8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-64cdc6ff96-x9mdd_openstack-operators(b24122be-246e-4dc9-a3ad-4ca2392a4660): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.497830 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd"] Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.498560 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2dc8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-64cdc6ff96-x9mdd_openstack-operators(b24122be-246e-4dc9-a3ad-4ca2392a4660): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.499688 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.505195 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p47ts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d77b94747-6kjgs_openstack-operators(719afb5d-40c4-4fa3-b030-38c170fc7dbb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.505358 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w"] Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.507470 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p47ts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d77b94747-6kjgs_openstack-operators(719afb5d-40c4-4fa3-b030-38c170fc7dbb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 15:03:09 crc kubenswrapper[4651]: W1126 15:03:09.508476 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda72e6d14_1571_4b70_b872_a4a4b0b3c242.slice/crio-cd41ac2b4e40f4d967b1ba400214c605652ce866f1f4f54e31ac0aae42a54a9f WatchSource:0}: Error finding container cd41ac2b4e40f4d967b1ba400214c605652ce866f1f4f54e31ac0aae42a54a9f: Status 404 returned error can't find the container with id cd41ac2b4e40f4d967b1ba400214c605652ce866f1f4f54e31ac0aae42a54a9f Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.508534 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" 
podUID="719afb5d-40c4-4fa3-b030-38c170fc7dbb" Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.511465 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrz2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wwjsd_openstack-operators(a72e6d14-1571-4b70-b872-a4a4b0b3c242): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.512807 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" podUID="a72e6d14-1571-4b70-b872-a4a4b0b3c242" Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.599901 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.600153 4651 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.600200 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert podName:6b7bc81d-5bbe-4c1b-a512-93e75a1f7035 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:11.600185159 +0000 UTC m=+759.025932763 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" (UID: "6b7bc81d-5bbe-4c1b-a512-93e75a1f7035") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.721977 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" event={"ID":"eed373f0-add9-4ae8-b5cc-ed711e79b5c5","Type":"ContainerStarted","Data":"c0f15aaee9edaaa4b5b7427f51f4ca89d204819544c0d1fcf4b5d2e1e524a1a2"} Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.723061 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" event={"ID":"e8ad6eac-027c-4615-a5dd-6facdc1db056","Type":"ContainerStarted","Data":"b9d3426ad0a35d993d20d12c9cb972d70de5462828eee0fcb19d2ac1333b12bd"} Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.724083 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" event={"ID":"8cd427a2-9759-460e-b86e-23e08dd7ba78","Type":"ContainerStarted","Data":"3134a8c8655363f3bb854e36221851195b16ee9275a55ebb0a2f93d6f4c3f8f2"} Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.726003 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" podUID="e8ad6eac-027c-4615-a5dd-6facdc1db056" Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.726137 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" event={"ID":"719afb5d-40c4-4fa3-b030-38c170fc7dbb","Type":"ContainerStarted","Data":"8c120e6e4b30a5d1e33e515030788b6ee555acec6440c7fd86b2bcb3500a10a4"} Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.726924 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" podUID="8cd427a2-9759-460e-b86e-23e08dd7ba78" Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.727108 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" event={"ID":"b24122be-246e-4dc9-a3ad-4ca2392a4660","Type":"ContainerStarted","Data":"3880445631f218e55d9786db87e428d436b889964c0e4641e8f56e848e4f63ee"} Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.728519 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" podUID="719afb5d-40c4-4fa3-b030-38c170fc7dbb" Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.728985 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" event={"ID":"14110a58-3dd5-4827-8a86-d4c0fc377b97","Type":"ContainerStarted","Data":"df44cfdbbfc58d50167489b3dbdba32f12200b5d1d4aaa5e73b97262d6e78aea"} Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.729706 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.730146 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" event={"ID":"a72e6d14-1571-4b70-b872-a4a4b0b3c242","Type":"ContainerStarted","Data":"cd41ac2b4e40f4d967b1ba400214c605652ce866f1f4f54e31ac0aae42a54a9f"} Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.730952 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" event={"ID":"66532d04-3411-4813-ae53-4d635ee98911","Type":"ContainerStarted","Data":"1ff058938085a1b11a24b07a8660c6fa1cc93ead8a7593a86b042fc952e4c98b"} Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.733518 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" podUID="a72e6d14-1571-4b70-b872-a4a4b0b3c242" Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.734803 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" event={"ID":"ce4c06a7-4bcb-4167-bec1-14a45ca24bea","Type":"ContainerStarted","Data":"b84edc5b60b2d805c2ce3e1c0e444446ba1e6e24c4441bd68f071168d061fc1f"} Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.737062 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" event={"ID":"8271ec0d-f8ea-4c46-984f-95572691a379","Type":"ContainerStarted","Data":"bb18d5a373222e4a2b73bd1a792a61bf213193c16eb357191c4e336df6c25c67"} Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.741629 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" event={"ID":"dc5a51cf-b992-4542-8b00-2948ab513eed","Type":"ContainerStarted","Data":"abcc5e0894b52477f9e4c1eac0d9432c78ee12ac0217c1ca6cc2e362b85cd347"} Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.766475 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" 
event={"ID":"e9981be4-751d-4c74-894a-698adad4c50f","Type":"ContainerStarted","Data":"a1230a38e8caa9531b666f756b9c7122357e904a328d822db96b011f57254881"} Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.770438 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" event={"ID":"e5c0812c-3183-4f45-b6b9-d4975f8bb80a","Type":"ContainerStarted","Data":"df02fc6c66b7bc650ac83eaa0e13ab913705fd953f6a5e7e2eec0126cd89e56d"} Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.784568 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" event={"ID":"8a55643f-68a5-47ea-8b27-db437d3af215","Type":"ContainerStarted","Data":"da1df53366017aaf5a61b3719d9355caeac8c6eb3685acbdd003139620c0364e"} Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.792537 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" event={"ID":"a8e49781-2e0b-476d-be9f-e17f05639447","Type":"ContainerStarted","Data":"c21bc648ec967a9fe3411c72fb6763116f38628c7d6c21f6f0b616f69f455fa5"} Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.803091 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" event={"ID":"53400076-0e4e-4e0b-b476-d4a1fd901631","Type":"ContainerStarted","Data":"3459b86aef8b02ae6d1126f8c06ced422d879081f1cc1f349f7aa3f2ddbe11c5"} Nov 26 15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.806877 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 
15:03:09 crc kubenswrapper[4651]: I1126 15:03:09.807025 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.807167 4651 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.807207 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs podName:e50a607f-7a61-4a78-870a-297fa0daa977 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:11.807193544 +0000 UTC m=+759.232941148 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs") pod "openstack-operator-controller-manager-5bcdd9fbc-vsb4g" (UID: "e50a607f-7a61-4a78-870a-297fa0daa977") : secret "metrics-server-cert" not found Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.807462 4651 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 15:03:09 crc kubenswrapper[4651]: E1126 15:03:09.807484 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs podName:e50a607f-7a61-4a78-870a-297fa0daa977 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:11.807477591 +0000 UTC m=+759.233225185 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs") pod "openstack-operator-controller-manager-5bcdd9fbc-vsb4g" (UID: "e50a607f-7a61-4a78-870a-297fa0daa977") : secret "webhook-server-cert" not found Nov 26 15:03:10 crc kubenswrapper[4651]: E1126 15:03:10.842722 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" podUID="a72e6d14-1571-4b70-b872-a4a4b0b3c242" Nov 26 15:03:10 crc kubenswrapper[4651]: E1126 15:03:10.849373 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" Nov 26 15:03:10 crc kubenswrapper[4651]: E1126 15:03:10.851692 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" podUID="e8ad6eac-027c-4615-a5dd-6facdc1db056" Nov 26 15:03:10 crc kubenswrapper[4651]: E1126 15:03:10.851797 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" podUID="8cd427a2-9759-460e-b86e-23e08dd7ba78" Nov 26 15:03:10 crc kubenswrapper[4651]: E1126 15:03:10.859259 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" podUID="719afb5d-40c4-4fa3-b030-38c170fc7dbb" Nov 26 15:03:10 crc kubenswrapper[4651]: I1126 15:03:10.929968 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert\") pod \"infra-operator-controller-manager-57548d458d-shslt\" (UID: \"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:10 crc kubenswrapper[4651]: E1126 15:03:10.930146 4651 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Nov 26 15:03:10 crc kubenswrapper[4651]: E1126 15:03:10.930193 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert podName:99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:14.930179128 +0000 UTC m=+762.355926732 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert") pod "infra-operator-controller-manager-57548d458d-shslt" (UID: "99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6") : secret "infra-operator-webhook-server-cert" not found Nov 26 15:03:11 crc kubenswrapper[4651]: I1126 15:03:11.638640 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:11 crc kubenswrapper[4651]: E1126 15:03:11.638855 4651 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:11 crc kubenswrapper[4651]: E1126 15:03:11.638953 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert podName:6b7bc81d-5bbe-4c1b-a512-93e75a1f7035 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:15.638932707 +0000 UTC m=+763.064680351 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" (UID: "6b7bc81d-5bbe-4c1b-a512-93e75a1f7035") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:11 crc kubenswrapper[4651]: I1126 15:03:11.840776 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:11 crc kubenswrapper[4651]: E1126 15:03:11.840933 4651 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 15:03:11 crc kubenswrapper[4651]: E1126 15:03:11.841260 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs podName:e50a607f-7a61-4a78-870a-297fa0daa977 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:15.841237846 +0000 UTC m=+763.266985450 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs") pod "openstack-operator-controller-manager-5bcdd9fbc-vsb4g" (UID: "e50a607f-7a61-4a78-870a-297fa0daa977") : secret "webhook-server-cert" not found Nov 26 15:03:11 crc kubenswrapper[4651]: I1126 15:03:11.841280 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:11 crc kubenswrapper[4651]: E1126 15:03:11.841450 4651 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 15:03:11 crc kubenswrapper[4651]: E1126 15:03:11.841492 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs podName:e50a607f-7a61-4a78-870a-297fa0daa977 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:15.841482262 +0000 UTC m=+763.267229866 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs") pod "openstack-operator-controller-manager-5bcdd9fbc-vsb4g" (UID: "e50a607f-7a61-4a78-870a-297fa0daa977") : secret "metrics-server-cert" not found Nov 26 15:03:14 crc kubenswrapper[4651]: I1126 15:03:14.986954 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert\") pod \"infra-operator-controller-manager-57548d458d-shslt\" (UID: \"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:14 crc kubenswrapper[4651]: E1126 15:03:14.987173 4651 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 26 15:03:14 crc kubenswrapper[4651]: E1126 15:03:14.987624 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert podName:99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:22.987605886 +0000 UTC m=+770.413353490 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert") pod "infra-operator-controller-manager-57548d458d-shslt" (UID: "99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6") : secret "infra-operator-webhook-server-cert" not found Nov 26 15:03:15 crc kubenswrapper[4651]: I1126 15:03:15.697242 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:15 crc kubenswrapper[4651]: E1126 15:03:15.697436 4651 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:15 crc kubenswrapper[4651]: E1126 15:03:15.697519 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert podName:6b7bc81d-5bbe-4c1b-a512-93e75a1f7035 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:23.697502106 +0000 UTC m=+771.123249710 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" (UID: "6b7bc81d-5bbe-4c1b-a512-93e75a1f7035") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:15 crc kubenswrapper[4651]: I1126 15:03:15.900618 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:15 crc kubenswrapper[4651]: I1126 15:03:15.900724 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:15 crc kubenswrapper[4651]: E1126 15:03:15.900785 4651 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 26 15:03:15 crc kubenswrapper[4651]: E1126 15:03:15.900832 4651 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 26 15:03:15 crc kubenswrapper[4651]: E1126 15:03:15.900847 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs podName:e50a607f-7a61-4a78-870a-297fa0daa977 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:23.900832542 +0000 UTC m=+771.326580146 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs") pod "openstack-operator-controller-manager-5bcdd9fbc-vsb4g" (UID: "e50a607f-7a61-4a78-870a-297fa0daa977") : secret "metrics-server-cert" not found Nov 26 15:03:15 crc kubenswrapper[4651]: E1126 15:03:15.900865 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs podName:e50a607f-7a61-4a78-870a-297fa0daa977 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:23.900854253 +0000 UTC m=+771.326601857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs") pod "openstack-operator-controller-manager-5bcdd9fbc-vsb4g" (UID: "e50a607f-7a61-4a78-870a-297fa0daa977") : secret "webhook-server-cert" not found Nov 26 15:03:21 crc kubenswrapper[4651]: E1126 15:03:21.248437 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7" Nov 26 15:03:21 crc kubenswrapper[4651]: E1126 15:03:21.249307 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q898b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-79556f57fc-cnwcz_openstack-operators(e9981be4-751d-4c74-894a-698adad4c50f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:22 crc kubenswrapper[4651]: E1126 15:03:22.329273 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:2ee37ff474bee3203447df4f326a9279a515e770573153338296dd074722c677" Nov 26 15:03:22 crc kubenswrapper[4651]: E1126 15:03:22.329457 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:2ee37ff474bee3203447df4f326a9279a515e770573153338296dd074722c677,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cxcb8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5b77f656f-pt9q8_openstack-operators(e5c0812c-3183-4f45-b6b9-d4975f8bb80a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:23 crc kubenswrapper[4651]: I1126 15:03:23.005023 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert\") pod \"infra-operator-controller-manager-57548d458d-shslt\" (UID: \"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:23 crc kubenswrapper[4651]: I1126 15:03:23.029739 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6-cert\") pod \"infra-operator-controller-manager-57548d458d-shslt\" (UID: \"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:23 crc kubenswrapper[4651]: I1126 15:03:23.051487 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:23 crc kubenswrapper[4651]: I1126 15:03:23.715661 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:23 crc kubenswrapper[4651]: E1126 15:03:23.715854 4651 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:23 crc kubenswrapper[4651]: E1126 15:03:23.715915 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert podName:6b7bc81d-5bbe-4c1b-a512-93e75a1f7035 nodeName:}" failed. No retries permitted until 2025-11-26 15:03:39.715900246 +0000 UTC m=+787.141647850 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" (UID: "6b7bc81d-5bbe-4c1b-a512-93e75a1f7035") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 26 15:03:23 crc kubenswrapper[4651]: I1126 15:03:23.918631 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:23 crc kubenswrapper[4651]: I1126 15:03:23.918789 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:23 crc kubenswrapper[4651]: I1126 15:03:23.925129 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-webhook-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:23 crc kubenswrapper[4651]: I1126 15:03:23.932006 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e50a607f-7a61-4a78-870a-297fa0daa977-metrics-certs\") pod \"openstack-operator-controller-manager-5bcdd9fbc-vsb4g\" (UID: \"e50a607f-7a61-4a78-870a-297fa0daa977\") " 
pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:23 crc kubenswrapper[4651]: E1126 15:03:23.946971 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:3dbf9fd9dce75f1fb250ee4c4097ad77d2f34110b61d85e37abd9c472e022e6c" Nov 26 15:03:23 crc kubenswrapper[4651]: E1126 15:03:23.947210 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:3dbf9fd9dce75f1fb250ee4c4097ad77d2f34110b61d85e37abd9c472e022e6c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nmsj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7b64f4fb85-5jb5x_openstack-operators(ec10af15-dcf5-413d-87ef-0ca5a469b5fa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:24 crc kubenswrapper[4651]: I1126 15:03:24.011724 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:25 crc kubenswrapper[4651]: E1126 15:03:25.039898 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9413ed1bc2ae1a6bd28c59b1c7f7e91e1638de7b2a7d4729ed3fa2135182465d" Nov 26 15:03:25 crc kubenswrapper[4651]: E1126 15:03:25.040809 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9413ed1bc2ae1a6bd28c59b1c7f7e91e1638de7b2a7d4729ed3fa2135182465d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jtg5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5d494799bf-v89cv_openstack-operators(eed373f0-add9-4ae8-b5cc-ed711e79b5c5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:26 crc kubenswrapper[4651]: E1126 15:03:26.605227 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385" Nov 26 15:03:26 crc kubenswrapper[4651]: E1126 15:03:26.605735 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w9wbf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-8zvlb_openstack-operators(66532d04-3411-4813-ae53-4d635ee98911): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:27 crc kubenswrapper[4651]: E1126 15:03:27.222085 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423" Nov 26 15:03:27 crc kubenswrapper[4651]: E1126 15:03:27.222295 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5zkrc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57988cc5b5-269d2_openstack-operators(a8e49781-2e0b-476d-be9f-e17f05639447): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:28 crc kubenswrapper[4651]: E1126 15:03:28.568449 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e00a9ed0ab26c5b745bd804ab1fe6b22428d026f17ea05a05f045e060342f46c" Nov 26 15:03:28 crc kubenswrapper[4651]: E1126 15:03:28.568683 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e00a9ed0ab26c5b745bd804ab1fe6b22428d026f17ea05a05f045e060342f46c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d2kwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6fdcddb789-8h624_openstack-operators(8271ec0d-f8ea-4c46-984f-95572691a379): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:30 crc kubenswrapper[4651]: E1126 15:03:30.328590 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:ca332e48d07f932e470177e48dba9332848a1d14c857cff6f9bfb1adc1998482" Nov 26 15:03:30 crc kubenswrapper[4651]: E1126 15:03:30.329149 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:ca332e48d07f932e470177e48dba9332848a1d14c857cff6f9bfb1adc1998482,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-znmt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6b7f75547b-k4tq9_openstack-operators(85fb4e98-47db-403d-85e3-c2550cd47160): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:33 crc kubenswrapper[4651]: E1126 15:03:33.030720 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:888edf6f432e52eaa5fc3caeae616fe38a3302b006bbba0e38885b2beba9f0f2" Nov 26 15:03:33 crc kubenswrapper[4651]: E1126 15:03:33.031288 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:888edf6f432e52eaa5fc3caeae616fe38a3302b006bbba0e38885b2beba9f0f2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pmk9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-66f4dd4bc7-ffbs5_openstack-operators(8a55643f-68a5-47ea-8b27-db437d3af215): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:34 crc kubenswrapper[4651]: E1126 15:03:34.433854 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:d65dbfc956e9cf376f3c48fc3a0942cb7306b5164f898c40d1efca106df81db7" Nov 26 15:03:34 crc kubenswrapper[4651]: E1126 15:03:34.434472 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:d65dbfc956e9cf376f3c48fc3a0942cb7306b5164f898c40d1efca106df81db7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hhldx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-67cb4dc6d4-cggjs_openstack-operators(14110a58-3dd5-4827-8a86-d4c0fc377b97): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:35 crc kubenswrapper[4651]: E1126 15:03:35.497648 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4" Nov 26 15:03:35 crc kubenswrapper[4651]: E1126 15:03:35.497829 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p47ts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d77b94747-6kjgs_openstack-operators(719afb5d-40c4-4fa3-b030-38c170fc7dbb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:38 crc kubenswrapper[4651]: E1126 15:03:38.553885 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c" Nov 26 15:03:38 crc kubenswrapper[4651]: E1126 15:03:38.554565 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2dc8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-64cdc6ff96-x9mdd_openstack-operators(b24122be-246e-4dc9-a3ad-4ca2392a4660): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:39 crc kubenswrapper[4651]: E1126 15:03:39.022002 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:25faa5b0e4801d4d3b01a28b877ed3188eee71f33ad66f3c2e86b7921758e711" Nov 26 15:03:39 crc kubenswrapper[4651]: E1126 15:03:39.022253 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:25faa5b0e4801d4d3b01a28b877ed3188eee71f33ad66f3c2e86b7921758e711,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jhwm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7b4567c7cf-hmndm_openstack-operators(dc5a51cf-b992-4542-8b00-2948ab513eed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:39 crc kubenswrapper[4651]: E1126 15:03:39.496140 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Nov 26 15:03:39 crc kubenswrapper[4651]: E1126 15:03:39.496389 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrz2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wwjsd_openstack-operators(a72e6d14-1571-4b70-b872-a4a4b0b3c242): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:03:39 crc kubenswrapper[4651]: E1126 15:03:39.497560 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" podUID="a72e6d14-1571-4b70-b872-a4a4b0b3c242" Nov 26 15:03:39 crc kubenswrapper[4651]: I1126 15:03:39.773397 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:39 crc kubenswrapper[4651]: I1126 15:03:39.781960 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7bc81d-5bbe-4c1b-a512-93e75a1f7035-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9\" (UID: \"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:39 crc kubenswrapper[4651]: I1126 15:03:39.959178 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zdppk" Nov 26 15:03:39 crc kubenswrapper[4651]: I1126 15:03:39.967292 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:40 crc kubenswrapper[4651]: I1126 15:03:40.027456 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-shslt"] Nov 26 15:03:40 crc kubenswrapper[4651]: I1126 15:03:40.034127 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g"] Nov 26 15:03:40 crc kubenswrapper[4651]: W1126 15:03:40.307394 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b3b839_cb5f_4e5e_82b4_cbbb7b18ddb6.slice/crio-3c683ebe54d0f4d12aa3f87dce03881addf6feae96ea41da3b09fb6240d34b7a WatchSource:0}: Error finding container 3c683ebe54d0f4d12aa3f87dce03881addf6feae96ea41da3b09fb6240d34b7a: Status 404 returned error can't find the container with id 3c683ebe54d0f4d12aa3f87dce03881addf6feae96ea41da3b09fb6240d34b7a Nov 26 15:03:41 crc kubenswrapper[4651]: I1126 15:03:41.069276 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" event={"ID":"53400076-0e4e-4e0b-b476-d4a1fd901631","Type":"ContainerStarted","Data":"e899c242aef2981c2a06f693cd6864e828910b21b4b430bb54f4fa497a21a270"} Nov 26 15:03:41 crc kubenswrapper[4651]: I1126 15:03:41.076402 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" event={"ID":"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6","Type":"ContainerStarted","Data":"3c683ebe54d0f4d12aa3f87dce03881addf6feae96ea41da3b09fb6240d34b7a"} Nov 26 15:03:41 crc kubenswrapper[4651]: I1126 15:03:41.078237 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" 
event={"ID":"ce4c06a7-4bcb-4167-bec1-14a45ca24bea","Type":"ContainerStarted","Data":"08f68236bdb9a5f9948976152995692a30fb7b140f874c52e1a8df726eab5227"} Nov 26 15:03:41 crc kubenswrapper[4651]: I1126 15:03:41.087410 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" event={"ID":"6a660fe2-a185-4e56-98cb-b12cdd749964","Type":"ContainerStarted","Data":"19838810e2e215a7b2662e85aec61819aef61f77f3fc2f0bf80f3814a1031970"} Nov 26 15:03:41 crc kubenswrapper[4651]: I1126 15:03:41.094100 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" event={"ID":"e8ad6eac-027c-4615-a5dd-6facdc1db056","Type":"ContainerStarted","Data":"6ae5b12af8c621d86bc62319fd366dd54ebd9fb95be30c8f8337c1f7390221a1"} Nov 26 15:03:41 crc kubenswrapper[4651]: I1126 15:03:41.097028 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" event={"ID":"e50a607f-7a61-4a78-870a-297fa0daa977","Type":"ContainerStarted","Data":"c45d8db1ee32fbee85782fa6278a140842f2f14aee9b4287289b6612b0792535"} Nov 26 15:03:41 crc kubenswrapper[4651]: I1126 15:03:41.097140 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" event={"ID":"e50a607f-7a61-4a78-870a-297fa0daa977","Type":"ContainerStarted","Data":"10d8af5dd2b005c20658cfd00c3505e98027c1dddc2331abd1cad6607288a4bf"} Nov 26 15:03:41 crc kubenswrapper[4651]: I1126 15:03:41.097176 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:41 crc kubenswrapper[4651]: I1126 15:03:41.102643 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" 
event={"ID":"5f58ef49-d516-48e5-a508-e4102374d111","Type":"ContainerStarted","Data":"9d1704816bea93211f441f3fb2009a77565f483b2b8e946797c7c3bff6e497fe"} Nov 26 15:03:41 crc kubenswrapper[4651]: I1126 15:03:41.150198 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" podStartSLOduration=34.15018114 podStartE2EDuration="34.15018114s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:03:41.129683042 +0000 UTC m=+788.555430666" watchObservedRunningTime="2025-11-26 15:03:41.15018114 +0000 UTC m=+788.575928734" Nov 26 15:03:41 crc kubenswrapper[4651]: I1126 15:03:41.348261 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9"] Nov 26 15:03:42 crc kubenswrapper[4651]: I1126 15:03:42.111985 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" event={"ID":"8cd427a2-9759-460e-b86e-23e08dd7ba78","Type":"ContainerStarted","Data":"0b711569ef0ffd586847bc5b97844ac34fdfc34c85d0e549c6958268777db4cd"} Nov 26 15:03:42 crc kubenswrapper[4651]: I1126 15:03:42.113731 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" event={"ID":"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035","Type":"ContainerStarted","Data":"1c8f7270e09b94d7de6b319aee6104864e800a31ab08dcc38a1bbd3b1a10708d"} Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.250133 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" podUID="ec10af15-dcf5-413d-87ef-0ca5a469b5fa" Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.286533 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" podUID="e9981be4-751d-4c74-894a-698adad4c50f" Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.316974 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" podUID="8271ec0d-f8ea-4c46-984f-95572691a379" Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.320974 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" podUID="85fb4e98-47db-403d-85e3-c2550cd47160" Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.342378 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.345801 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" podUID="dc5a51cf-b992-4542-8b00-2948ab513eed" Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.412944 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" podUID="a8e49781-2e0b-476d-be9f-e17f05639447" Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.432221 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" podUID="66532d04-3411-4813-ae53-4d635ee98911" Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.693893 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.722258 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" podUID="eed373f0-add9-4ae8-b5cc-ed711e79b5c5" Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.858018 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" podUID="8a55643f-68a5-47ea-8b27-db437d3af215" Nov 26 15:03:45 crc kubenswrapper[4651]: E1126 15:03:45.918872 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" podUID="719afb5d-40c4-4fa3-b030-38c170fc7dbb" Nov 26 15:03:46 crc kubenswrapper[4651]: E1126 15:03:46.153267 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" podUID="e5c0812c-3183-4f45-b6b9-d4975f8bb80a" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.199957 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" event={"ID":"8cd427a2-9759-460e-b86e-23e08dd7ba78","Type":"ContainerStarted","Data":"da18832d06bce34fda9d5ce475823bf7f788ce0a31006966048df6ab2e832864"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.200148 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.202784 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" event={"ID":"53400076-0e4e-4e0b-b476-d4a1fd901631","Type":"ContainerStarted","Data":"22ab63486cb591df07e6e4097db5221788f26b72d5723445abc27c56f48e222f"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.203442 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" 
Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.208597 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.209291 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.215410 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" event={"ID":"e9981be4-751d-4c74-894a-698adad4c50f","Type":"ContainerStarted","Data":"a74a3bf77456cef2a5b590c481a21b81824afdcd0b9ebe8b1aebb883531f343a"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.225243 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" podStartSLOduration=3.253036404 podStartE2EDuration="39.225227007s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.476274446 +0000 UTC m=+756.902022050" lastFinishedPulling="2025-11-26 15:03:45.448465049 +0000 UTC m=+792.874212653" observedRunningTime="2025-11-26 15:03:46.220840509 +0000 UTC m=+793.646588123" watchObservedRunningTime="2025-11-26 15:03:46.225227007 +0000 UTC m=+793.650974611" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.229843 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" event={"ID":"14110a58-3dd5-4827-8a86-d4c0fc377b97","Type":"ContainerStarted","Data":"6d37b55b02e8bb103c92868a678377f8d34bb11262291134bd2769a4f05f10e3"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.237112 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" 
event={"ID":"a8e49781-2e0b-476d-be9f-e17f05639447","Type":"ContainerStarted","Data":"750c30a8135797310cf0e15cc7051ba4461f02f22ead529ae5a39066e6cf91b8"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.253548 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" event={"ID":"6a660fe2-a185-4e56-98cb-b12cdd749964","Type":"ContainerStarted","Data":"87fec591e1f26b66b3a5edec211d98a9590ee5dafc30f6b80f41e16c7a061e10"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.254306 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.260995 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" podStartSLOduration=3.409950989 podStartE2EDuration="39.260974463s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.060392287 +0000 UTC m=+756.486139891" lastFinishedPulling="2025-11-26 15:03:44.911415761 +0000 UTC m=+792.337163365" observedRunningTime="2025-11-26 15:03:46.256878583 +0000 UTC m=+793.682626207" watchObservedRunningTime="2025-11-26 15:03:46.260974463 +0000 UTC m=+793.686722057" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.262259 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.265854 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" event={"ID":"719afb5d-40c4-4fa3-b030-38c170fc7dbb","Type":"ContainerStarted","Data":"7c7e6642a0ae27034727fb7d179e589ebcb3648ab06552e54e0b881c83654cfc"} Nov 26 15:03:46 crc kubenswrapper[4651]: E1126 15:03:46.269885 4651 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" podUID="719afb5d-40c4-4fa3-b030-38c170fc7dbb" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.271773 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" event={"ID":"dc5a51cf-b992-4542-8b00-2948ab513eed","Type":"ContainerStarted","Data":"bfdb9564b29f11280726a270c25acc6d60641edbf27c779c3c6757361da5aa31"} Nov 26 15:03:46 crc kubenswrapper[4651]: E1126 15:03:46.276855 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:25faa5b0e4801d4d3b01a28b877ed3188eee71f33ad66f3c2e86b7921758e711\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" podUID="dc5a51cf-b992-4542-8b00-2948ab513eed" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.281650 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" event={"ID":"b24122be-246e-4dc9-a3ad-4ca2392a4660","Type":"ContainerStarted","Data":"f791b0a5b4d00d3f5ee290b1e147d2ad830840b549ab4c4125757c6b516beff8"} Nov 26 15:03:46 crc kubenswrapper[4651]: E1126 15:03:46.287273 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c\\\"\"" 
pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.301727 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" event={"ID":"85fb4e98-47db-403d-85e3-c2550cd47160","Type":"ContainerStarted","Data":"0b70597a07bee4b25a7f1cd84f2625d93a84cf3b3907afda43999cf3938e954c"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.319448 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.326990 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.329829 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" event={"ID":"8271ec0d-f8ea-4c46-984f-95572691a379","Type":"ContainerStarted","Data":"d1612a2e7ee249cd012ed61af7fb4df9332b38686b98fa797d969cb6c399ed95"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.343131 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" event={"ID":"5f58ef49-d516-48e5-a508-e4102374d111","Type":"ContainerStarted","Data":"79f97a560ed9d2be188aa24ba4c1e3d3753aa346ff2db5cae3e9d817c9c55f1a"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.344158 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.358319 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.364790 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" event={"ID":"ec10af15-dcf5-413d-87ef-0ca5a469b5fa","Type":"ContainerStarted","Data":"5cc7febf36ea84269f42e4a61092563e014cfc1ba3510f8435632067c70d67aa"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.369312 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" event={"ID":"66532d04-3411-4813-ae53-4d635ee98911","Type":"ContainerStarted","Data":"48e42299be0257fd1cdfdea59b32b59d8dc5b3a71d523b474e8f4a401fa45a1a"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.389543 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" event={"ID":"8a55643f-68a5-47ea-8b27-db437d3af215","Type":"ContainerStarted","Data":"93c0e814aa9a23d7c1b983c16077ea3dc26742f24320b0d0fce5dd4d89823f91"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.428024 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" event={"ID":"eed373f0-add9-4ae8-b5cc-ed711e79b5c5","Type":"ContainerStarted","Data":"4176fd3a2e14235f41cd6910650e910b38d35288086fe944fca55296268011f0"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.466695 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" event={"ID":"e5c0812c-3183-4f45-b6b9-d4975f8bb80a","Type":"ContainerStarted","Data":"d60f95baa5d94387ec9b11cbebef464fc3ee8501c0e527c1bf4fbbbb8df4f076"} Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.578343 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" podStartSLOduration=3.082146355 podStartE2EDuration="39.578320467s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.328444354 +0000 UTC m=+756.754191958" lastFinishedPulling="2025-11-26 15:03:45.824618466 +0000 UTC m=+793.250366070" observedRunningTime="2025-11-26 15:03:46.526321416 +0000 UTC m=+793.952069040" watchObservedRunningTime="2025-11-26 15:03:46.578320467 +0000 UTC m=+794.004068071" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.690216 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" podStartSLOduration=3.993932747 podStartE2EDuration="40.690189078s" podCreationTimestamp="2025-11-26 15:03:06 +0000 UTC" firstStartedPulling="2025-11-26 15:03:08.527985023 +0000 UTC m=+755.953732627" lastFinishedPulling="2025-11-26 15:03:45.224241354 +0000 UTC m=+792.649988958" observedRunningTime="2025-11-26 15:03:46.67342026 +0000 UTC m=+794.099167874" watchObservedRunningTime="2025-11-26 15:03:46.690189078 +0000 UTC m=+794.115936692" Nov 26 15:03:46 crc kubenswrapper[4651]: I1126 15:03:46.804156 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" podStartSLOduration=4.43878973 podStartE2EDuration="40.804142564s" podCreationTimestamp="2025-11-26 15:03:06 +0000 UTC" firstStartedPulling="2025-11-26 15:03:08.569512803 +0000 UTC m=+755.995260407" lastFinishedPulling="2025-11-26 15:03:44.934865637 +0000 UTC m=+792.360613241" observedRunningTime="2025-11-26 15:03:46.763305073 +0000 UTC m=+794.189052697" watchObservedRunningTime="2025-11-26 15:03:46.804142564 +0000 UTC m=+794.229890168" Nov 26 15:03:47 crc kubenswrapper[4651]: I1126 15:03:47.491150 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" event={"ID":"ce4c06a7-4bcb-4167-bec1-14a45ca24bea","Type":"ContainerStarted","Data":"45b7b6ac5555e665a3848a2f74f52e9299a7ac5652d43806c3819e51a8c66406"} Nov 26 15:03:47 crc kubenswrapper[4651]: I1126 15:03:47.514678 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" event={"ID":"e8ad6eac-027c-4615-a5dd-6facdc1db056","Type":"ContainerStarted","Data":"68a561102b21d18dbd83cb0c39501d64f4f8574185eed53263c06b325bcef754"} Nov 26 15:03:47 crc kubenswrapper[4651]: I1126 15:03:47.515298 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" Nov 26 15:03:47 crc kubenswrapper[4651]: E1126 15:03:47.517535 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:25faa5b0e4801d4d3b01a28b877ed3188eee71f33ad66f3c2e86b7921758e711\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" podUID="dc5a51cf-b992-4542-8b00-2948ab513eed" Nov 26 15:03:47 crc kubenswrapper[4651]: I1126 15:03:47.520695 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" Nov 26 15:03:47 crc kubenswrapper[4651]: I1126 15:03:47.586055 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" podStartSLOduration=4.197805564 podStartE2EDuration="40.586025989s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.462219651 +0000 UTC m=+756.887967255" lastFinishedPulling="2025-11-26 15:03:45.850440076 +0000 UTC m=+793.276187680" 
observedRunningTime="2025-11-26 15:03:47.557366152 +0000 UTC m=+794.983113776" watchObservedRunningTime="2025-11-26 15:03:47.586025989 +0000 UTC m=+795.011773583" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.527566 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" event={"ID":"e9981be4-751d-4c74-894a-698adad4c50f","Type":"ContainerStarted","Data":"e4e86fbd4a7f1e3ca3c6445da766092d64defe9efc96d1b66e6e2082bc18e9f7"} Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.527720 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.530111 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" event={"ID":"ec10af15-dcf5-413d-87ef-0ca5a469b5fa","Type":"ContainerStarted","Data":"66d4e1a8a7464741d6291f866867bf3b5f18546445bfdc1338aff5d4014af56a"} Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.530293 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.536686 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" event={"ID":"eed373f0-add9-4ae8-b5cc-ed711e79b5c5","Type":"ContainerStarted","Data":"a71712d16bc040e56e42409d28dfe7133b928d732e60f0bf7fd5948d3351fabe"} Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.536842 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.539832 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" event={"ID":"8271ec0d-f8ea-4c46-984f-95572691a379","Type":"ContainerStarted","Data":"ad887845c585d6e5bcef2dd124a04b5b952a0903b3ca9b4454446fdca2ea51c1"} Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.540097 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.546735 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" event={"ID":"85fb4e98-47db-403d-85e3-c2550cd47160","Type":"ContainerStarted","Data":"09100959ca7bdf7d783741de3afd6c235fa6e921771a8abea9d4cce719368185"} Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.546925 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.553846 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" event={"ID":"e5c0812c-3183-4f45-b6b9-d4975f8bb80a","Type":"ContainerStarted","Data":"7c46a1b0f4f43559adce3d1a67d26ec5eaac5f1152a61a729bcdb293d1a406ab"} Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.553995 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.556211 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" podStartSLOduration=3.67442744 podStartE2EDuration="41.556175967s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.18953863 +0000 UTC m=+756.615286244" lastFinishedPulling="2025-11-26 
15:03:47.071287167 +0000 UTC m=+794.497034771" observedRunningTime="2025-11-26 15:03:48.548893001 +0000 UTC m=+795.974640625" watchObservedRunningTime="2025-11-26 15:03:48.556175967 +0000 UTC m=+795.981923571" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.557591 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" event={"ID":"14110a58-3dd5-4827-8a86-d4c0fc377b97","Type":"ContainerStarted","Data":"3dfdb12a088d907a58eb0d8f09b3f0260add56c0bdb3a616b591972ca4cc60e3"} Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.558363 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.569002 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" event={"ID":"66532d04-3411-4813-ae53-4d635ee98911","Type":"ContainerStarted","Data":"254bb7226198214e8b2d36818fe4295bed20fbe71567f6521ae1043594715684"} Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.569700 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.581485 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" event={"ID":"8a55643f-68a5-47ea-8b27-db437d3af215","Type":"ContainerStarted","Data":"29d6967888c7f1e1b4e4d2dd10230d66cb68af8b05e13bbbd4da168847ceb8ad"} Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.581620 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.590364 4651 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" event={"ID":"a8e49781-2e0b-476d-be9f-e17f05639447","Type":"ContainerStarted","Data":"5f4df4abc8cfc206de00d39d6be18c73e05ea9a7a2a6343dce8f6c26d5729dbf"} Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.590412 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.593694 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" podStartSLOduration=3.971607489 podStartE2EDuration="42.593669869s" podCreationTimestamp="2025-11-26 15:03:06 +0000 UTC" firstStartedPulling="2025-11-26 15:03:08.493552952 +0000 UTC m=+755.919300556" lastFinishedPulling="2025-11-26 15:03:47.115615332 +0000 UTC m=+794.541362936" observedRunningTime="2025-11-26 15:03:48.589185559 +0000 UTC m=+796.014933163" watchObservedRunningTime="2025-11-26 15:03:48.593669869 +0000 UTC m=+796.019417473" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.615500 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" podStartSLOduration=3.291210183 podStartE2EDuration="41.615478462s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:08.766610472 +0000 UTC m=+756.192358076" lastFinishedPulling="2025-11-26 15:03:47.090878751 +0000 UTC m=+794.516626355" observedRunningTime="2025-11-26 15:03:48.611163527 +0000 UTC m=+796.036911131" watchObservedRunningTime="2025-11-26 15:03:48.615478462 +0000 UTC m=+796.041226066" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.629937 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" 
podStartSLOduration=3.698377054 podStartE2EDuration="42.629920838s" podCreationTimestamp="2025-11-26 15:03:06 +0000 UTC" firstStartedPulling="2025-11-26 15:03:08.352796979 +0000 UTC m=+755.778544583" lastFinishedPulling="2025-11-26 15:03:47.284340763 +0000 UTC m=+794.710088367" observedRunningTime="2025-11-26 15:03:48.629527838 +0000 UTC m=+796.055275442" watchObservedRunningTime="2025-11-26 15:03:48.629920838 +0000 UTC m=+796.055668442" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.671411 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" podStartSLOduration=3.93367009 podStartE2EDuration="41.671392477s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.355083576 +0000 UTC m=+756.780831180" lastFinishedPulling="2025-11-26 15:03:47.092805963 +0000 UTC m=+794.518553567" observedRunningTime="2025-11-26 15:03:48.670873783 +0000 UTC m=+796.096621387" watchObservedRunningTime="2025-11-26 15:03:48.671392477 +0000 UTC m=+796.097140091" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.735913 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" podStartSLOduration=3.677947604 podStartE2EDuration="41.735894181s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.057216522 +0000 UTC m=+756.482964126" lastFinishedPulling="2025-11-26 15:03:47.115163099 +0000 UTC m=+794.540910703" observedRunningTime="2025-11-26 15:03:48.709520057 +0000 UTC m=+796.135267691" watchObservedRunningTime="2025-11-26 15:03:48.735894181 +0000 UTC m=+796.161641795" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.737558 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" 
podStartSLOduration=3.841785745 podStartE2EDuration="41.737548236s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.355528928 +0000 UTC m=+756.781276532" lastFinishedPulling="2025-11-26 15:03:47.251291419 +0000 UTC m=+794.677039023" observedRunningTime="2025-11-26 15:03:48.733255591 +0000 UTC m=+796.159003225" watchObservedRunningTime="2025-11-26 15:03:48.737548236 +0000 UTC m=+796.163295840" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.767457 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" podStartSLOduration=4.049312532 podStartE2EDuration="41.767434104s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.355460846 +0000 UTC m=+756.781208450" lastFinishedPulling="2025-11-26 15:03:47.073582428 +0000 UTC m=+794.499330022" observedRunningTime="2025-11-26 15:03:48.763169691 +0000 UTC m=+796.188917325" watchObservedRunningTime="2025-11-26 15:03:48.767434104 +0000 UTC m=+796.193181708" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.790264 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" podStartSLOduration=4.173879813 podStartE2EDuration="41.790243094s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.457215687 +0000 UTC m=+756.882963291" lastFinishedPulling="2025-11-26 15:03:47.073578978 +0000 UTC m=+794.499326572" observedRunningTime="2025-11-26 15:03:48.788401945 +0000 UTC m=+796.214149559" watchObservedRunningTime="2025-11-26 15:03:48.790243094 +0000 UTC m=+796.215990708" Nov 26 15:03:48 crc kubenswrapper[4651]: I1126 15:03:48.812285 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" 
podStartSLOduration=5.044421472 podStartE2EDuration="42.812261763s" podCreationTimestamp="2025-11-26 15:03:06 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.312556869 +0000 UTC m=+756.738304473" lastFinishedPulling="2025-11-26 15:03:47.08039716 +0000 UTC m=+794.506144764" observedRunningTime="2025-11-26 15:03:48.812212202 +0000 UTC m=+796.237959816" watchObservedRunningTime="2025-11-26 15:03:48.812261763 +0000 UTC m=+796.238009377" Nov 26 15:03:50 crc kubenswrapper[4651]: I1126 15:03:50.604591 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" event={"ID":"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035","Type":"ContainerStarted","Data":"a0b98aa6738a112301787477518204e7e34082286f8bbe28d18d40d8ae5b394f"} Nov 26 15:03:50 crc kubenswrapper[4651]: I1126 15:03:50.604899 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" event={"ID":"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035","Type":"ContainerStarted","Data":"9306a8b04c2d0a0d6f9de98d6be8ed6001c3035b9773ed5831a0c8b5c465adf2"} Nov 26 15:03:50 crc kubenswrapper[4651]: I1126 15:03:50.604917 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:03:50 crc kubenswrapper[4651]: I1126 15:03:50.606265 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" event={"ID":"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6","Type":"ContainerStarted","Data":"da5d213f45eec9d1b54e683a55955c6c841456b1efc177e8c3990df9fdd337ac"} Nov 26 15:03:50 crc kubenswrapper[4651]: I1126 15:03:50.606316 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" 
event={"ID":"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6","Type":"ContainerStarted","Data":"8836958b25c08ba398d2599e613f6fe57f79fd51839e0bed6314f94ea6d1b99c"} Nov 26 15:03:50 crc kubenswrapper[4651]: I1126 15:03:50.628353 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" podStartSLOduration=35.383966247 podStartE2EDuration="43.628336218s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:41.692960982 +0000 UTC m=+789.118708586" lastFinishedPulling="2025-11-26 15:03:49.937330953 +0000 UTC m=+797.363078557" observedRunningTime="2025-11-26 15:03:50.627604968 +0000 UTC m=+798.053352592" watchObservedRunningTime="2025-11-26 15:03:50.628336218 +0000 UTC m=+798.054083822" Nov 26 15:03:50 crc kubenswrapper[4651]: I1126 15:03:50.664370 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" podStartSLOduration=35.052768046 podStartE2EDuration="44.66434689s" podCreationTimestamp="2025-11-26 15:03:06 +0000 UTC" firstStartedPulling="2025-11-26 15:03:40.325182134 +0000 UTC m=+787.750929728" lastFinishedPulling="2025-11-26 15:03:49.936760968 +0000 UTC m=+797.362508572" observedRunningTime="2025-11-26 15:03:50.653247603 +0000 UTC m=+798.078995227" watchObservedRunningTime="2025-11-26 15:03:50.66434689 +0000 UTC m=+798.090094494" Nov 26 15:03:51 crc kubenswrapper[4651]: E1126 15:03:51.403660 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" podUID="a72e6d14-1571-4b70-b872-a4a4b0b3c242" Nov 26 15:03:51 crc kubenswrapper[4651]: 
I1126 15:03:51.612330 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:03:54 crc kubenswrapper[4651]: I1126 15:03:54.020232 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:03:57 crc kubenswrapper[4651]: I1126 15:03:57.215356 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" Nov 26 15:03:57 crc kubenswrapper[4651]: I1126 15:03:57.232184 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" Nov 26 15:03:57 crc kubenswrapper[4651]: I1126 15:03:57.443256 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" Nov 26 15:03:57 crc kubenswrapper[4651]: I1126 15:03:57.487152 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" Nov 26 15:03:57 crc kubenswrapper[4651]: I1126 15:03:57.642628 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" Nov 26 15:03:57 crc kubenswrapper[4651]: I1126 15:03:57.696457 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" Nov 26 15:03:57 crc kubenswrapper[4651]: I1126 15:03:57.697716 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" Nov 26 15:03:57 crc kubenswrapper[4651]: I1126 15:03:57.746588 4651 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" Nov 26 15:03:58 crc kubenswrapper[4651]: I1126 15:03:58.081731 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" Nov 26 15:03:58 crc kubenswrapper[4651]: I1126 15:03:58.368473 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" Nov 26 15:03:59 crc kubenswrapper[4651]: I1126 15:03:59.974100 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:04:01 crc kubenswrapper[4651]: I1126 15:04:01.680410 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" event={"ID":"b24122be-246e-4dc9-a3ad-4ca2392a4660","Type":"ContainerStarted","Data":"d68457aa0182b7b472e788b09f60b13e7f5345e7a22b5bc1daa6ec33f2b70b24"} Nov 26 15:04:01 crc kubenswrapper[4651]: I1126 15:04:01.680934 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:04:01 crc kubenswrapper[4651]: I1126 15:04:01.697442 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" podStartSLOduration=3.292044547 podStartE2EDuration="54.69742546s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.496412515 +0000 UTC m=+756.922160119" lastFinishedPulling="2025-11-26 15:04:00.901793418 +0000 UTC m=+808.327541032" observedRunningTime="2025-11-26 15:04:01.696876455 +0000 UTC m=+809.122624079" watchObservedRunningTime="2025-11-26 15:04:01.69742546 +0000 UTC m=+809.123173064" Nov 26 
15:04:02 crc kubenswrapper[4651]: I1126 15:04:02.691133 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" event={"ID":"719afb5d-40c4-4fa3-b030-38c170fc7dbb","Type":"ContainerStarted","Data":"4aec82603b80b561de33cfb0129cc4a117978fa847ac8eb77ffb74b4e2c43db9"} Nov 26 15:04:02 crc kubenswrapper[4651]: I1126 15:04:02.691695 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" Nov 26 15:04:02 crc kubenswrapper[4651]: I1126 15:04:02.713481 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" podStartSLOduration=3.351022794 podStartE2EDuration="55.713463925s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.505060326 +0000 UTC m=+756.930807930" lastFinishedPulling="2025-11-26 15:04:01.867501457 +0000 UTC m=+809.293249061" observedRunningTime="2025-11-26 15:04:02.707003211 +0000 UTC m=+810.132750825" watchObservedRunningTime="2025-11-26 15:04:02.713463925 +0000 UTC m=+810.139211529" Nov 26 15:04:03 crc kubenswrapper[4651]: I1126 15:04:03.062709 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:04:03 crc kubenswrapper[4651]: I1126 15:04:03.699908 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" event={"ID":"a72e6d14-1571-4b70-b872-a4a4b0b3c242","Type":"ContainerStarted","Data":"0104f495a20a880cdd317a6da9eb4040c59e5ada8c866b537adbb1edf8e011b1"} Nov 26 15:04:03 crc kubenswrapper[4651]: I1126 15:04:03.715061 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" 
podStartSLOduration=3.123001726 podStartE2EDuration="56.715008081s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:09.511371554 +0000 UTC m=+756.937119158" lastFinishedPulling="2025-11-26 15:04:03.103377909 +0000 UTC m=+810.529125513" observedRunningTime="2025-11-26 15:04:03.712466483 +0000 UTC m=+811.138214127" watchObservedRunningTime="2025-11-26 15:04:03.715008081 +0000 UTC m=+811.140755685" Nov 26 15:04:04 crc kubenswrapper[4651]: I1126 15:04:04.708856 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" event={"ID":"dc5a51cf-b992-4542-8b00-2948ab513eed","Type":"ContainerStarted","Data":"6a8080f48adc920ab614914aeb492288994ad89d16d849f7fd0e64a5bc233d94"} Nov 26 15:04:04 crc kubenswrapper[4651]: I1126 15:04:04.709402 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" Nov 26 15:04:04 crc kubenswrapper[4651]: I1126 15:04:04.728733 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" podStartSLOduration=2.513506222 podStartE2EDuration="57.728716664s" podCreationTimestamp="2025-11-26 15:03:07 +0000 UTC" firstStartedPulling="2025-11-26 15:03:08.759254396 +0000 UTC m=+756.185002000" lastFinishedPulling="2025-11-26 15:04:03.974464838 +0000 UTC m=+811.400212442" observedRunningTime="2025-11-26 15:04:04.725369504 +0000 UTC m=+812.151117128" watchObservedRunningTime="2025-11-26 15:04:04.728716664 +0000 UTC m=+812.154464268" Nov 26 15:04:08 crc kubenswrapper[4651]: I1126 15:04:08.183604 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" Nov 26 15:04:08 crc kubenswrapper[4651]: I1126 15:04:08.183950 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:04:17 crc kubenswrapper[4651]: I1126 15:04:17.466475 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.227315 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bssfk"] Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.229124 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.231426 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.235289 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rdrk6" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.240287 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bssfk"] Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.241103 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.245028 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.305115 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tqr4l"] Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.306214 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.308238 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.317867 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tqr4l"] Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.404492 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tqr4l\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.404547 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w76v\" (UniqueName: \"kubernetes.io/projected/79f5b89a-277a-4612-b47e-6edd56e43226-kube-api-access-9w76v\") pod \"dnsmasq-dns-78dd6ddcc-tqr4l\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.404626 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prhvf\" (UniqueName: \"kubernetes.io/projected/a644fe47-8779-4a82-b1b1-1364500cd340-kube-api-access-prhvf\") pod \"dnsmasq-dns-675f4bcbfc-bssfk\" (UID: \"a644fe47-8779-4a82-b1b1-1364500cd340\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.404648 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-config\") pod \"dnsmasq-dns-78dd6ddcc-tqr4l\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.404702 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a644fe47-8779-4a82-b1b1-1364500cd340-config\") pod \"dnsmasq-dns-675f4bcbfc-bssfk\" (UID: \"a644fe47-8779-4a82-b1b1-1364500cd340\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.505433 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prhvf\" (UniqueName: \"kubernetes.io/projected/a644fe47-8779-4a82-b1b1-1364500cd340-kube-api-access-prhvf\") pod \"dnsmasq-dns-675f4bcbfc-bssfk\" (UID: \"a644fe47-8779-4a82-b1b1-1364500cd340\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.505470 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-config\") pod \"dnsmasq-dns-78dd6ddcc-tqr4l\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.505556 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a644fe47-8779-4a82-b1b1-1364500cd340-config\") pod \"dnsmasq-dns-675f4bcbfc-bssfk\" (UID: \"a644fe47-8779-4a82-b1b1-1364500cd340\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.505577 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tqr4l\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:04:40 crc kubenswrapper[4651]: 
I1126 15:04:40.505608 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w76v\" (UniqueName: \"kubernetes.io/projected/79f5b89a-277a-4612-b47e-6edd56e43226-kube-api-access-9w76v\") pod \"dnsmasq-dns-78dd6ddcc-tqr4l\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.506340 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a644fe47-8779-4a82-b1b1-1364500cd340-config\") pod \"dnsmasq-dns-675f4bcbfc-bssfk\" (UID: \"a644fe47-8779-4a82-b1b1-1364500cd340\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.506440 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-config\") pod \"dnsmasq-dns-78dd6ddcc-tqr4l\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.506652 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tqr4l\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.523598 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prhvf\" (UniqueName: \"kubernetes.io/projected/a644fe47-8779-4a82-b1b1-1364500cd340-kube-api-access-prhvf\") pod \"dnsmasq-dns-675f4bcbfc-bssfk\" (UID: \"a644fe47-8779-4a82-b1b1-1364500cd340\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.524188 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9w76v\" (UniqueName: \"kubernetes.io/projected/79f5b89a-277a-4612-b47e-6edd56e43226-kube-api-access-9w76v\") pod \"dnsmasq-dns-78dd6ddcc-tqr4l\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.545662 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" Nov 26 15:04:40 crc kubenswrapper[4651]: I1126 15:04:40.622675 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:04:41 crc kubenswrapper[4651]: I1126 15:04:41.043025 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bssfk"] Nov 26 15:04:41 crc kubenswrapper[4651]: I1126 15:04:41.046742 4651 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:04:41 crc kubenswrapper[4651]: I1126 15:04:41.109080 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tqr4l"] Nov 26 15:04:41 crc kubenswrapper[4651]: W1126 15:04:41.112355 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f5b89a_277a_4612_b47e_6edd56e43226.slice/crio-c2d4ae8e6f06d32c929f749076559856506684bc2a54b5378ba4bd433de2ebd5 WatchSource:0}: Error finding container c2d4ae8e6f06d32c929f749076559856506684bc2a54b5378ba4bd433de2ebd5: Status 404 returned error can't find the container with id c2d4ae8e6f06d32c929f749076559856506684bc2a54b5378ba4bd433de2ebd5 Nov 26 15:04:41 crc kubenswrapper[4651]: I1126 15:04:41.984393 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" event={"ID":"a644fe47-8779-4a82-b1b1-1364500cd340","Type":"ContainerStarted","Data":"146a42d2831e4f0ba5b2ca40174e878d7ee49e634145ee326f4f6e322fd10f1b"} Nov 26 15:04:41 crc 
kubenswrapper[4651]: I1126 15:04:41.985950 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" event={"ID":"79f5b89a-277a-4612-b47e-6edd56e43226","Type":"ContainerStarted","Data":"c2d4ae8e6f06d32c929f749076559856506684bc2a54b5378ba4bd433de2ebd5"} Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.541688 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bssfk"] Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.576988 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7fbgb"] Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.578251 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.587910 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7fbgb"] Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.741925 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67vcb\" (UniqueName: \"kubernetes.io/projected/554bf1dd-041e-4f64-bd95-548210d76b0c-kube-api-access-67vcb\") pod \"dnsmasq-dns-666b6646f7-7fbgb\" (UID: \"554bf1dd-041e-4f64-bd95-548210d76b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.741990 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-config\") pod \"dnsmasq-dns-666b6646f7-7fbgb\" (UID: \"554bf1dd-041e-4f64-bd95-548210d76b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.742050 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-7fbgb\" (UID: \"554bf1dd-041e-4f64-bd95-548210d76b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.842802 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67vcb\" (UniqueName: \"kubernetes.io/projected/554bf1dd-041e-4f64-bd95-548210d76b0c-kube-api-access-67vcb\") pod \"dnsmasq-dns-666b6646f7-7fbgb\" (UID: \"554bf1dd-041e-4f64-bd95-548210d76b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.842847 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-config\") pod \"dnsmasq-dns-666b6646f7-7fbgb\" (UID: \"554bf1dd-041e-4f64-bd95-548210d76b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.842883 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-7fbgb\" (UID: \"554bf1dd-041e-4f64-bd95-548210d76b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.844178 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-7fbgb\" (UID: \"554bf1dd-041e-4f64-bd95-548210d76b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.845142 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-config\") pod \"dnsmasq-dns-666b6646f7-7fbgb\" (UID: 
\"554bf1dd-041e-4f64-bd95-548210d76b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.870417 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67vcb\" (UniqueName: \"kubernetes.io/projected/554bf1dd-041e-4f64-bd95-548210d76b0c-kube-api-access-67vcb\") pod \"dnsmasq-dns-666b6646f7-7fbgb\" (UID: \"554bf1dd-041e-4f64-bd95-548210d76b0c\") " pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.901751 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tqr4l"] Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.908423 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.931062 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wqn6m"] Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.932489 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:04:42 crc kubenswrapper[4651]: I1126 15:04:42.953976 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wqn6m"] Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.046851 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcv9v\" (UniqueName: \"kubernetes.io/projected/1b0fa55c-de9a-4cde-8602-2ac0086c6528-kube-api-access-bcv9v\") pod \"dnsmasq-dns-57d769cc4f-wqn6m\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.046940 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wqn6m\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.046969 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-config\") pod \"dnsmasq-dns-57d769cc4f-wqn6m\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.148501 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-config\") pod \"dnsmasq-dns-57d769cc4f-wqn6m\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.148569 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcv9v\" (UniqueName: 
\"kubernetes.io/projected/1b0fa55c-de9a-4cde-8602-2ac0086c6528-kube-api-access-bcv9v\") pod \"dnsmasq-dns-57d769cc4f-wqn6m\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.148637 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wqn6m\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.149412 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wqn6m\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.149437 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-config\") pod \"dnsmasq-dns-57d769cc4f-wqn6m\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.195077 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcv9v\" (UniqueName: \"kubernetes.io/projected/1b0fa55c-de9a-4cde-8602-2ac0086c6528-kube-api-access-bcv9v\") pod \"dnsmasq-dns-57d769cc4f-wqn6m\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.282364 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.356002 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7fbgb"] Nov 26 15:04:43 crc kubenswrapper[4651]: W1126 15:04:43.398768 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod554bf1dd_041e_4f64_bd95_548210d76b0c.slice/crio-7a1c4cdbc4699788cd0db636aa486d2730e3eae9b2f24f5c8d3ee293d080c29b WatchSource:0}: Error finding container 7a1c4cdbc4699788cd0db636aa486d2730e3eae9b2f24f5c8d3ee293d080c29b: Status 404 returned error can't find the container with id 7a1c4cdbc4699788cd0db636aa486d2730e3eae9b2f24f5c8d3ee293d080c29b Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.745805 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.773181 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.773340 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.776246 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.776299 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.776405 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.776523 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.776690 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rdhrz" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.776893 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.779582 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.855808 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wqn6m"] Nov 26 15:04:43 crc kubenswrapper[4651]: W1126 15:04:43.882958 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b0fa55c_de9a_4cde_8602_2ac0086c6528.slice/crio-548dfb589387e98679693393cb9fc07ba4ef2c7ae14fdc95e246c0c38357bb1d WatchSource:0}: Error finding container 548dfb589387e98679693393cb9fc07ba4ef2c7ae14fdc95e246c0c38357bb1d: Status 404 returned error can't find the container with id 548dfb589387e98679693393cb9fc07ba4ef2c7ae14fdc95e246c0c38357bb1d Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 
15:04:43.883829 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.883881 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f351a70-5e04-4270-b9bb-00586a94da1f-config-data\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.883897 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.883925 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqqcc\" (UniqueName: \"kubernetes.io/projected/8f351a70-5e04-4270-b9bb-00586a94da1f-kube-api-access-dqqcc\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.883981 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f351a70-5e04-4270-b9bb-00586a94da1f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.884013 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.884069 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.884091 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f351a70-5e04-4270-b9bb-00586a94da1f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.884110 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.884152 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f351a70-5e04-4270-b9bb-00586a94da1f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.884201 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f351a70-5e04-4270-b9bb-00586a94da1f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.985108 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.985150 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f351a70-5e04-4270-b9bb-00586a94da1f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.985181 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f351a70-5e04-4270-b9bb-00586a94da1f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.985223 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.985241 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f351a70-5e04-4270-b9bb-00586a94da1f-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.985263 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.985292 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqqcc\" (UniqueName: \"kubernetes.io/projected/8f351a70-5e04-4270-b9bb-00586a94da1f-kube-api-access-dqqcc\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.985328 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.985347 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f351a70-5e04-4270-b9bb-00586a94da1f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.985376 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.985398 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f351a70-5e04-4270-b9bb-00586a94da1f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.986452 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f351a70-5e04-4270-b9bb-00586a94da1f-config-data\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.987441 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f351a70-5e04-4270-b9bb-00586a94da1f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.988111 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.988203 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.988234 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f351a70-5e04-4270-b9bb-00586a94da1f-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.990354 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f351a70-5e04-4270-b9bb-00586a94da1f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.991302 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.991769 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f351a70-5e04-4270-b9bb-00586a94da1f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:43 crc kubenswrapper[4651]: I1126 15:04:43.993475 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.003058 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f351a70-5e04-4270-b9bb-00586a94da1f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.004942 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqqcc\" (UniqueName: \"kubernetes.io/projected/8f351a70-5e04-4270-b9bb-00586a94da1f-kube-api-access-dqqcc\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.036214 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"8f351a70-5e04-4270-b9bb-00586a94da1f\") " pod="openstack/rabbitmq-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.048175 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" event={"ID":"1b0fa55c-de9a-4cde-8602-2ac0086c6528","Type":"ContainerStarted","Data":"548dfb589387e98679693393cb9fc07ba4ef2c7ae14fdc95e246c0c38357bb1d"} Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.049745 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" event={"ID":"554bf1dd-041e-4f64-bd95-548210d76b0c","Type":"ContainerStarted","Data":"7a1c4cdbc4699788cd0db636aa486d2730e3eae9b2f24f5c8d3ee293d080c29b"} Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.077913 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.079290 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.082898 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.083176 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.083622 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mdhqs" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.083868 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.084055 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.084686 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.084691 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.100876 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.105357 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.188245 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.188605 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fc026e6-8f32-45d0-bab4-c12dd93d946f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.188632 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.188650 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fc026e6-8f32-45d0-bab4-c12dd93d946f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.188674 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fc026e6-8f32-45d0-bab4-c12dd93d946f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.188693 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.188711 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.188730 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.188779 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fc026e6-8f32-45d0-bab4-c12dd93d946f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.188818 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fc026e6-8f32-45d0-bab4-c12dd93d946f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.188907 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69s27\" (UniqueName: \"kubernetes.io/projected/4fc026e6-8f32-45d0-bab4-c12dd93d946f-kube-api-access-69s27\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.290876 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fc026e6-8f32-45d0-bab4-c12dd93d946f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.290932 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.290948 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fc026e6-8f32-45d0-bab4-c12dd93d946f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.290973 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fc026e6-8f32-45d0-bab4-c12dd93d946f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc 
kubenswrapper[4651]: I1126 15:04:44.291003 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.291019 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.291147 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.291232 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fc026e6-8f32-45d0-bab4-c12dd93d946f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.291254 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fc026e6-8f32-45d0-bab4-c12dd93d946f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.291283 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-69s27\" (UniqueName: \"kubernetes.io/projected/4fc026e6-8f32-45d0-bab4-c12dd93d946f-kube-api-access-69s27\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.291359 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.293079 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.294707 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fc026e6-8f32-45d0-bab4-c12dd93d946f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.295747 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fc026e6-8f32-45d0-bab4-c12dd93d946f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.296191 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.297957 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.302553 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.302668 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fc026e6-8f32-45d0-bab4-c12dd93d946f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.303265 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fc026e6-8f32-45d0-bab4-c12dd93d946f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.305127 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fc026e6-8f32-45d0-bab4-c12dd93d946f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.324674 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69s27\" (UniqueName: \"kubernetes.io/projected/4fc026e6-8f32-45d0-bab4-c12dd93d946f-kube-api-access-69s27\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.333419 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fc026e6-8f32-45d0-bab4-c12dd93d946f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.351885 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc026e6-8f32-45d0-bab4-c12dd93d946f\") " pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.430169 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:04:44 crc kubenswrapper[4651]: I1126 15:04:44.715584 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.064446 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8f351a70-5e04-4270-b9bb-00586a94da1f","Type":"ContainerStarted","Data":"badb872a007d431f2433ce56eac7eba2665f21990a46f6febd48c2736ba1bf4b"} Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.211449 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.372376 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.374511 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.385941 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.386201 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.386256 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bwqr5" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.392598 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.395395 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.419691 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd2f469-d922-4c5d-a885-517ad214a748-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.419764 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fcd2f469-d922-4c5d-a885-517ad214a748-kolla-config\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.419799 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcd2f469-d922-4c5d-a885-517ad214a748-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.419825 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbvdr\" (UniqueName: \"kubernetes.io/projected/fcd2f469-d922-4c5d-a885-517ad214a748-kube-api-access-bbvdr\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.419897 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.419978 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/fcd2f469-d922-4c5d-a885-517ad214a748-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.420015 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fcd2f469-d922-4c5d-a885-517ad214a748-config-data-default\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.420058 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd2f469-d922-4c5d-a885-517ad214a748-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.422973 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.521469 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.521606 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fcd2f469-d922-4c5d-a885-517ad214a748-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.521653 4651 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fcd2f469-d922-4c5d-a885-517ad214a748-config-data-default\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.521686 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd2f469-d922-4c5d-a885-517ad214a748-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.521776 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd2f469-d922-4c5d-a885-517ad214a748-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.521818 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fcd2f469-d922-4c5d-a885-517ad214a748-kolla-config\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.521854 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcd2f469-d922-4c5d-a885-517ad214a748-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.521886 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbvdr\" (UniqueName: 
\"kubernetes.io/projected/fcd2f469-d922-4c5d-a885-517ad214a748-kube-api-access-bbvdr\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.522640 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.526383 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fcd2f469-d922-4c5d-a885-517ad214a748-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.527157 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fcd2f469-d922-4c5d-a885-517ad214a748-config-data-default\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.532477 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fcd2f469-d922-4c5d-a885-517ad214a748-kolla-config\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.533546 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcd2f469-d922-4c5d-a885-517ad214a748-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.535247 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd2f469-d922-4c5d-a885-517ad214a748-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.547920 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd2f469-d922-4c5d-a885-517ad214a748-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.548782 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbvdr\" (UniqueName: \"kubernetes.io/projected/fcd2f469-d922-4c5d-a885-517ad214a748-kube-api-access-bbvdr\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.574400 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"fcd2f469-d922-4c5d-a885-517ad214a748\") " pod="openstack/openstack-galera-0" Nov 26 15:04:45 crc kubenswrapper[4651]: I1126 15:04:45.709949 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.095198 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4fc026e6-8f32-45d0-bab4-c12dd93d946f","Type":"ContainerStarted","Data":"255545edbed144d3764b77688869a40864bf4ca687881bb41d87fb86c6fe6b14"} Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.384684 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 26 15:04:46 crc kubenswrapper[4651]: W1126 15:04:46.411779 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcd2f469_d922_4c5d_a885_517ad214a748.slice/crio-598647d6c01540b65c983d4fed7b67af535b33c61d8dcb98cfcaad873903b711 WatchSource:0}: Error finding container 598647d6c01540b65c983d4fed7b67af535b33c61d8dcb98cfcaad873903b711: Status 404 returned error can't find the container with id 598647d6c01540b65c983d4fed7b67af535b33c61d8dcb98cfcaad873903b711 Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.704270 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.710515 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.710645 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.714645 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-cm4sk" Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.714995 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.722453 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.722626 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.848895 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fded8231-49dd-41d6-8e30-85572ad226db-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0" Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.848948 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fss52\" (UniqueName: \"kubernetes.io/projected/fded8231-49dd-41d6-8e30-85572ad226db-kube-api-access-fss52\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0" Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.848974 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fded8231-49dd-41d6-8e30-85572ad226db-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0" Nov 26 
15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.849006 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fded8231-49dd-41d6-8e30-85572ad226db-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.849126 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fded8231-49dd-41d6-8e30-85572ad226db-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.849146 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fded8231-49dd-41d6-8e30-85572ad226db-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.849166 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fded8231-49dd-41d6-8e30-85572ad226db-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.849189 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.866117 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.868616 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.878798 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-g2k9k"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.880295 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.880493 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.880602 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.951011 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fded8231-49dd-41d6-8e30-85572ad226db-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.951088 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fded8231-49dd-41d6-8e30-85572ad226db-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.951125 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.951179 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fded8231-49dd-41d6-8e30-85572ad226db-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.951221 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fss52\" (UniqueName: \"kubernetes.io/projected/fded8231-49dd-41d6-8e30-85572ad226db-kube-api-access-fss52\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.951256 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fded8231-49dd-41d6-8e30-85572ad226db-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.951315 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fded8231-49dd-41d6-8e30-85572ad226db-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.951387 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fded8231-49dd-41d6-8e30-85572ad226db-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.953540 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fded8231-49dd-41d6-8e30-85572ad226db-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.953949 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.954245 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fded8231-49dd-41d6-8e30-85572ad226db-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.955683 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fded8231-49dd-41d6-8e30-85572ad226db-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.960814 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fded8231-49dd-41d6-8e30-85572ad226db-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.965667 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fded8231-49dd-41d6-8e30-85572ad226db-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:46 crc kubenswrapper[4651]: I1126 15:04:46.981848 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fded8231-49dd-41d6-8e30-85572ad226db-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.002908 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.011329 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fss52\" (UniqueName: \"kubernetes.io/projected/fded8231-49dd-41d6-8e30-85572ad226db-kube-api-access-fss52\") pod \"openstack-cell1-galera-0\" (UID: \"fded8231-49dd-41d6-8e30-85572ad226db\") " pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.053010 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31f56fef-ac96-4560-b99c-71b77bcecd4b-config-data\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.053172 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f56fef-ac96-4560-b99c-71b77bcecd4b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.053191 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31f56fef-ac96-4560-b99c-71b77bcecd4b-kolla-config\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.053273 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f56fef-ac96-4560-b99c-71b77bcecd4b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.053299 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rrw5\" (UniqueName: \"kubernetes.io/projected/31f56fef-ac96-4560-b99c-71b77bcecd4b-kube-api-access-4rrw5\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.062485 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.155165 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rrw5\" (UniqueName: \"kubernetes.io/projected/31f56fef-ac96-4560-b99c-71b77bcecd4b-kube-api-access-4rrw5\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.155234 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31f56fef-ac96-4560-b99c-71b77bcecd4b-config-data\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.155257 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f56fef-ac96-4560-b99c-71b77bcecd4b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.155280 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31f56fef-ac96-4560-b99c-71b77bcecd4b-kolla-config\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.155388 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f56fef-ac96-4560-b99c-71b77bcecd4b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.161182 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f56fef-ac96-4560-b99c-71b77bcecd4b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.162066 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31f56fef-ac96-4560-b99c-71b77bcecd4b-config-data\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.163840 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31f56fef-ac96-4560-b99c-71b77bcecd4b-kolla-config\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.171077 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fcd2f469-d922-4c5d-a885-517ad214a748","Type":"ContainerStarted","Data":"598647d6c01540b65c983d4fed7b67af535b33c61d8dcb98cfcaad873903b711"}
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.171341 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f56fef-ac96-4560-b99c-71b77bcecd4b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.184734 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rrw5\" (UniqueName: \"kubernetes.io/projected/31f56fef-ac96-4560-b99c-71b77bcecd4b-kube-api-access-4rrw5\") pod \"memcached-0\" (UID: \"31f56fef-ac96-4560-b99c-71b77bcecd4b\") " pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.211591 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.805583 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Nov 26 15:04:47 crc kubenswrapper[4651]: W1126 15:04:47.834454 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfded8231_49dd_41d6_8e30_85572ad226db.slice/crio-40a6453086f8039c2c5ef105318590201a09c49fa7d2ca894b262431856fea39 WatchSource:0}: Error finding container 40a6453086f8039c2c5ef105318590201a09c49fa7d2ca894b262431856fea39: Status 404 returned error can't find the container with id 40a6453086f8039c2c5ef105318590201a09c49fa7d2ca894b262431856fea39
Nov 26 15:04:47 crc kubenswrapper[4651]: I1126 15:04:47.921380 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Nov 26 15:04:47 crc kubenswrapper[4651]: W1126 15:04:47.933762 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f56fef_ac96_4560_b99c_71b77bcecd4b.slice/crio-01b09d28f9f41dd3b43577fb16d16d04f7c0a05662a60ce5ef20b170cb932bfd WatchSource:0}: Error finding container 01b09d28f9f41dd3b43577fb16d16d04f7c0a05662a60ce5ef20b170cb932bfd: Status 404 returned error can't find the container with id 01b09d28f9f41dd3b43577fb16d16d04f7c0a05662a60ce5ef20b170cb932bfd
Nov 26 15:04:48 crc kubenswrapper[4651]: I1126 15:04:48.262363 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"31f56fef-ac96-4560-b99c-71b77bcecd4b","Type":"ContainerStarted","Data":"01b09d28f9f41dd3b43577fb16d16d04f7c0a05662a60ce5ef20b170cb932bfd"}
Nov 26 15:04:48 crc kubenswrapper[4651]: I1126 15:04:48.267352 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fded8231-49dd-41d6-8e30-85572ad226db","Type":"ContainerStarted","Data":"40a6453086f8039c2c5ef105318590201a09c49fa7d2ca894b262431856fea39"}
Nov 26 15:04:48 crc kubenswrapper[4651]: I1126 15:04:48.868790 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 26 15:04:48 crc kubenswrapper[4651]: I1126 15:04:48.869931 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 26 15:04:48 crc kubenswrapper[4651]: I1126 15:04:48.874987 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-bkc2d"
Nov 26 15:04:48 crc kubenswrapper[4651]: I1126 15:04:48.880365 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 26 15:04:49 crc kubenswrapper[4651]: I1126 15:04:49.001183 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2d8c\" (UniqueName: \"kubernetes.io/projected/5e9fabd8-6c8f-4ff7-960c-29c3105073b5-kube-api-access-h2d8c\") pod \"kube-state-metrics-0\" (UID: \"5e9fabd8-6c8f-4ff7-960c-29c3105073b5\") " pod="openstack/kube-state-metrics-0"
Nov 26 15:04:49 crc kubenswrapper[4651]: I1126 15:04:49.103819 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2d8c\" (UniqueName: \"kubernetes.io/projected/5e9fabd8-6c8f-4ff7-960c-29c3105073b5-kube-api-access-h2d8c\") pod \"kube-state-metrics-0\" (UID: \"5e9fabd8-6c8f-4ff7-960c-29c3105073b5\") " pod="openstack/kube-state-metrics-0"
Nov 26 15:04:49 crc kubenswrapper[4651]: I1126 15:04:49.121830 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2d8c\" (UniqueName: \"kubernetes.io/projected/5e9fabd8-6c8f-4ff7-960c-29c3105073b5-kube-api-access-h2d8c\") pod \"kube-state-metrics-0\" (UID: \"5e9fabd8-6c8f-4ff7-960c-29c3105073b5\") " pod="openstack/kube-state-metrics-0"
Nov 26 15:04:49 crc kubenswrapper[4651]: I1126 15:04:49.213714 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 26 15:04:50 crc kubenswrapper[4651]: I1126 15:04:50.197122 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.456606 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zrhdf"]
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.457951 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.460601 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.467756 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.467986 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-qqpd8"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.486128 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4hsfq"]
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.492985 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.550425 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zrhdf"]
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.590592 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4hsfq"]
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.606191 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.607427 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615374 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-var-run\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615419 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-var-log-ovn\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615468 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-var-log\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615495 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-var-run\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615537 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-etc-ovs\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615569 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-var-lib\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615587 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1e2255f-1898-48ec-b534-24907368820d-scripts\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615613 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-scripts\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615639 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-ovn-controller-tls-certs\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615670 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zthl\" (UniqueName: \"kubernetes.io/projected/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-kube-api-access-8zthl\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615693 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-combined-ca-bundle\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615711 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrt25\" (UniqueName: \"kubernetes.io/projected/e1e2255f-1898-48ec-b534-24907368820d-kube-api-access-jrt25\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.615742 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-var-run-ovn\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.617356 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dkbnm"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.618117 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.618795 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.619020 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.623168 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.663451 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.717299 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce029284-0f6b-4827-9831-6c9b6b5cec58-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.719730 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zthl\" (UniqueName: \"kubernetes.io/projected/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-kube-api-access-8zthl\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.719931 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-combined-ca-bundle\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.720050 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrt25\" (UniqueName: \"kubernetes.io/projected/e1e2255f-1898-48ec-b534-24907368820d-kube-api-access-jrt25\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.720190 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce029284-0f6b-4827-9831-6c9b6b5cec58-config\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.720302 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-var-run-ovn\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.720423 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-var-run\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.720516 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.720610 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-var-log-ovn\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.720715 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce029284-0f6b-4827-9831-6c9b6b5cec58-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.720802 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmwz5\" (UniqueName: \"kubernetes.io/projected/ce029284-0f6b-4827-9831-6c9b6b5cec58-kube-api-access-jmwz5\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.721060 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-var-log\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.721168 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-var-run\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.721288 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce029284-0f6b-4827-9831-6c9b6b5cec58-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.721370 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-etc-ovs\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.721454 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce029284-0f6b-4827-9831-6c9b6b5cec58-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.721540 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-var-lib\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.721617 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1e2255f-1898-48ec-b534-24907368820d-scripts\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.721692 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce029284-0f6b-4827-9831-6c9b6b5cec58-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.721739 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-var-run\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.721830 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-scripts\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.724711 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-ovn-controller-tls-certs\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.722076 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-var-run-ovn\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.722387 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-var-log-ovn\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.722462 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-var-log\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.723395 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-var-lib\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.724608 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-scripts\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.725243 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-var-run\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.722217 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e1e2255f-1898-48ec-b534-24907368820d-etc-ovs\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.726784 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-combined-ca-bundle\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.730454 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-ovn-controller-tls-certs\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.737169 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1e2255f-1898-48ec-b534-24907368820d-scripts\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.739073 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zthl\" (UniqueName: \"kubernetes.io/projected/13f26ce1-fcd6-47bf-b95d-d93e41dd795f-kube-api-access-8zthl\") pod \"ovn-controller-zrhdf\" (UID: \"13f26ce1-fcd6-47bf-b95d-d93e41dd795f\") " pod="openstack/ovn-controller-zrhdf"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.739097 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrt25\" (UniqueName: \"kubernetes.io/projected/e1e2255f-1898-48ec-b534-24907368820d-kube-api-access-jrt25\") pod \"ovn-controller-ovs-4hsfq\" (UID: \"e1e2255f-1898-48ec-b534-24907368820d\") " pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.828869 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce029284-0f6b-4827-9831-6c9b6b5cec58-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0"
Nov 26 15:04:53 crc
kubenswrapper[4651]: I1126 15:04:53.829360 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce029284-0f6b-4827-9831-6c9b6b5cec58-config\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.829754 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.832207 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce029284-0f6b-4827-9831-6c9b6b5cec58-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.837746 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce029284-0f6b-4827-9831-6c9b6b5cec58-config\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.847402 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.847716 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/ce029284-0f6b-4827-9831-6c9b6b5cec58-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.848294 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmwz5\" (UniqueName: \"kubernetes.io/projected/ce029284-0f6b-4827-9831-6c9b6b5cec58-kube-api-access-jmwz5\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.848784 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce029284-0f6b-4827-9831-6c9b6b5cec58-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.849227 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce029284-0f6b-4827-9831-6c9b6b5cec58-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.849349 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce029284-0f6b-4827-9831-6c9b6b5cec58-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.848263 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce029284-0f6b-4827-9831-6c9b6b5cec58-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " 
pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.850360 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce029284-0f6b-4827-9831-6c9b6b5cec58-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.853523 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce029284-0f6b-4827-9831-6c9b6b5cec58-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.853913 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce029284-0f6b-4827-9831-6c9b6b5cec58-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.860799 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.868615 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmwz5\" (UniqueName: \"kubernetes.io/projected/ce029284-0f6b-4827-9831-6c9b6b5cec58-kube-api-access-jmwz5\") pod \"ovsdbserver-nb-0\" (UID: \"ce029284-0f6b-4827-9831-6c9b6b5cec58\") " pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.869353 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zrhdf" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.902509 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4hsfq" Nov 26 15:04:53 crc kubenswrapper[4651]: I1126 15:04:53.937352 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.047635 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.049088 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.053765 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-nj5mh" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.054057 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.054077 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.054234 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.064569 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.197335 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d41f859-b430-492d-856b-e623f18f5df9-config\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: 
I1126 15:04:56.197399 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d41f859-b430-492d-856b-e623f18f5df9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.197424 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d41f859-b430-492d-856b-e623f18f5df9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.197461 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d41f859-b430-492d-856b-e623f18f5df9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.197484 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.197501 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d41f859-b430-492d-856b-e623f18f5df9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.197534 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d41f859-b430-492d-856b-e623f18f5df9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.197581 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwklm\" (UniqueName: \"kubernetes.io/projected/1d41f859-b430-492d-856b-e623f18f5df9-kube-api-access-mwklm\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.299657 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwklm\" (UniqueName: \"kubernetes.io/projected/1d41f859-b430-492d-856b-e623f18f5df9-kube-api-access-mwklm\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.299767 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d41f859-b430-492d-856b-e623f18f5df9-config\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.299809 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d41f859-b430-492d-856b-e623f18f5df9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.299838 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1d41f859-b430-492d-856b-e623f18f5df9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.299872 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d41f859-b430-492d-856b-e623f18f5df9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.299894 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.299964 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d41f859-b430-492d-856b-e623f18f5df9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.300006 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d41f859-b430-492d-856b-e623f18f5df9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.301910 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d41f859-b430-492d-856b-e623f18f5df9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc 
kubenswrapper[4651]: I1126 15:04:56.302331 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d41f859-b430-492d-856b-e623f18f5df9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.303487 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.306270 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d41f859-b430-492d-856b-e623f18f5df9-config\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.309119 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d41f859-b430-492d-856b-e623f18f5df9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.309976 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d41f859-b430-492d-856b-e623f18f5df9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.310760 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1d41f859-b430-492d-856b-e623f18f5df9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.331173 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwklm\" (UniqueName: \"kubernetes.io/projected/1d41f859-b430-492d-856b-e623f18f5df9-kube-api-access-mwklm\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.345713 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1d41f859-b430-492d-856b-e623f18f5df9\") " pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: I1126 15:04:56.375985 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 26 15:04:56 crc kubenswrapper[4651]: W1126 15:04:56.781488 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e9fabd8_6c8f_4ff7_960c_29c3105073b5.slice/crio-6523803621b129fb26d0c52b4b33d0bd076170d7103200f45f7936687e0b0065 WatchSource:0}: Error finding container 6523803621b129fb26d0c52b4b33d0bd076170d7103200f45f7936687e0b0065: Status 404 returned error can't find the container with id 6523803621b129fb26d0c52b4b33d0bd076170d7103200f45f7936687e0b0065 Nov 26 15:04:57 crc kubenswrapper[4651]: I1126 15:04:57.416235 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5e9fabd8-6c8f-4ff7-960c-29c3105073b5","Type":"ContainerStarted","Data":"6523803621b129fb26d0c52b4b33d0bd076170d7103200f45f7936687e0b0065"} Nov 26 15:04:59 crc kubenswrapper[4651]: I1126 15:04:59.132660 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:04:59 crc kubenswrapper[4651]: I1126 15:04:59.133000 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:05:07 crc kubenswrapper[4651]: E1126 15:05:07.996205 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Nov 26 15:05:07 crc kubenswrapper[4651]: E1126 15:05:07.996906 4651 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fss52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDe
vices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(fded8231-49dd-41d6-8e30-85572ad226db): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:05:07 crc kubenswrapper[4651]: E1126 15:05:07.998356 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="fded8231-49dd-41d6-8e30-85572ad226db" Nov 26 15:05:08 crc kubenswrapper[4651]: E1126 15:05:08.829883 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="fded8231-49dd-41d6-8e30-85572ad226db" Nov 26 15:05:09 crc kubenswrapper[4651]: E1126 15:05:09.166830 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Nov 26 15:05:09 crc kubenswrapper[4651]: E1126 15:05:09.167290 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 
/var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-69s27,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevic
e{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(4fc026e6-8f32-45d0-bab4-c12dd93d946f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:05:09 crc kubenswrapper[4651]: E1126 15:05:09.168930 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="4fc026e6-8f32-45d0-bab4-c12dd93d946f" Nov 26 15:05:09 crc kubenswrapper[4651]: E1126 15:05:09.188927 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Nov 26 15:05:09 crc kubenswrapper[4651]: E1126 15:05:09.189126 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bbvdr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(fcd2f469-d922-4c5d-a885-517ad214a748): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:05:09 crc kubenswrapper[4651]: E1126 15:05:09.190308 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="fcd2f469-d922-4c5d-a885-517ad214a748" Nov 26 15:05:09 crc kubenswrapper[4651]: E1126 15:05:09.843659 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="4fc026e6-8f32-45d0-bab4-c12dd93d946f" Nov 26 15:05:09 crc kubenswrapper[4651]: E1126 15:05:09.843750 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="fcd2f469-d922-4c5d-a885-517ad214a748" Nov 26 15:05:13 crc kubenswrapper[4651]: E1126 15:05:13.463393 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Nov 26 15:05:13 crc kubenswrapper[4651]: E1126 15:05:13.464316 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > 
/var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dqqcc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil
,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(8f351a70-5e04-4270-b9bb-00586a94da1f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:05:13 crc kubenswrapper[4651]: E1126 15:05:13.465613 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="8f351a70-5e04-4270-b9bb-00586a94da1f" Nov 26 15:05:13 crc kubenswrapper[4651]: E1126 15:05:13.875007 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="8f351a70-5e04-4270-b9bb-00586a94da1f" Nov 26 15:05:14 crc kubenswrapper[4651]: E1126 15:05:14.196420 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Nov 26 15:05:14 crc kubenswrapper[4651]: E1126 15:05:14.196913 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n5f6h67dh65ch697h597h56bh55h67hf6h58h674hfh5bfh698hdbh647h5ffh67dh59chdbh67h8bh669h577hdbhb7h676h686h57dh577h646h5dq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rrw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(31f56fef-ac96-4560-b99c-71b77bcecd4b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:05:14 crc kubenswrapper[4651]: E1126 15:05:14.198291 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="31f56fef-ac96-4560-b99c-71b77bcecd4b" Nov 26 15:05:14 crc kubenswrapper[4651]: E1126 15:05:14.894815 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="31f56fef-ac96-4560-b99c-71b77bcecd4b" Nov 26 15:05:14 crc kubenswrapper[4651]: E1126 15:05:14.983432 4651 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 26 15:05:14 crc kubenswrapper[4651]: E1126 15:05:14.983613 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prhvf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfil
e:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-bssfk_openstack(a644fe47-8779-4a82-b1b1-1364500cd340): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:05:14 crc kubenswrapper[4651]: E1126 15:05:14.986429 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" podUID="a644fe47-8779-4a82-b1b1-1364500cd340" Nov 26 15:05:14 crc kubenswrapper[4651]: E1126 15:05:14.992626 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 26 15:05:14 crc kubenswrapper[4651]: E1126 15:05:14.992774 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9w76v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-tqr4l_openstack(79f5b89a-277a-4612-b47e-6edd56e43226): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:05:14 crc kubenswrapper[4651]: E1126 15:05:14.993940 4651 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" podUID="79f5b89a-277a-4612-b47e-6edd56e43226" Nov 26 15:05:15 crc kubenswrapper[4651]: E1126 15:05:15.031434 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 26 15:05:15 crc kubenswrapper[4651]: E1126 15:05:15.031603 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-67vcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-7fbgb_openstack(554bf1dd-041e-4f64-bd95-548210d76b0c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:05:15 crc kubenswrapper[4651]: E1126 15:05:15.033028 4651 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" podUID="554bf1dd-041e-4f64-bd95-548210d76b0c" Nov 26 15:05:15 crc kubenswrapper[4651]: E1126 15:05:15.044303 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 26 15:05:15 crc kubenswrapper[4651]: E1126 15:05:15.044416 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bcv9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-wqn6m_openstack(1b0fa55c-de9a-4cde-8602-2ac0086c6528): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:05:15 crc kubenswrapper[4651]: E1126 15:05:15.045604 4651 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" podUID="1b0fa55c-de9a-4cde-8602-2ac0086c6528" Nov 26 15:05:15 crc kubenswrapper[4651]: I1126 15:05:15.091277 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 26 15:05:15 crc kubenswrapper[4651]: W1126 15:05:15.124745 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce029284_0f6b_4827_9831_6c9b6b5cec58.slice/crio-00843027f13c5bae85f647d6ceeb2368510e27f454568bc7ce42bb687d52bef5 WatchSource:0}: Error finding container 00843027f13c5bae85f647d6ceeb2368510e27f454568bc7ce42bb687d52bef5: Status 404 returned error can't find the container with id 00843027f13c5bae85f647d6ceeb2368510e27f454568bc7ce42bb687d52bef5 Nov 26 15:05:15 crc kubenswrapper[4651]: I1126 15:05:15.629557 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zrhdf"] Nov 26 15:05:15 crc kubenswrapper[4651]: I1126 15:05:15.657610 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 26 15:05:15 crc kubenswrapper[4651]: W1126 15:05:15.749922 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13f26ce1_fcd6_47bf_b95d_d93e41dd795f.slice/crio-b9293b46ebde9fe6b5433c28c9f5af4c672cce06bb4b1f87d8c8cfd9a3c2084d WatchSource:0}: Error finding container b9293b46ebde9fe6b5433c28c9f5af4c672cce06bb4b1f87d8c8cfd9a3c2084d: Status 404 returned error can't find the container with id b9293b46ebde9fe6b5433c28c9f5af4c672cce06bb4b1f87d8c8cfd9a3c2084d Nov 26 15:05:15 crc kubenswrapper[4651]: I1126 15:05:15.764445 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4hsfq"] Nov 26 15:05:15 crc kubenswrapper[4651]: W1126 
15:05:15.815119 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d41f859_b430_492d_856b_e623f18f5df9.slice/crio-c821d2a0b211879e0b747fd952bb7fd881a4c466fe564c1b9de20d16fed88678 WatchSource:0}: Error finding container c821d2a0b211879e0b747fd952bb7fd881a4c466fe564c1b9de20d16fed88678: Status 404 returned error can't find the container with id c821d2a0b211879e0b747fd952bb7fd881a4c466fe564c1b9de20d16fed88678 Nov 26 15:05:15 crc kubenswrapper[4651]: W1126 15:05:15.869713 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e2255f_1898_48ec_b534_24907368820d.slice/crio-5bd0c38cfc7fba310a8701be662f3caa7ae1602fe13047d831f58ac02adcb12f WatchSource:0}: Error finding container 5bd0c38cfc7fba310a8701be662f3caa7ae1602fe13047d831f58ac02adcb12f: Status 404 returned error can't find the container with id 5bd0c38cfc7fba310a8701be662f3caa7ae1602fe13047d831f58ac02adcb12f Nov 26 15:05:15 crc kubenswrapper[4651]: I1126 15:05:15.889260 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zrhdf" event={"ID":"13f26ce1-fcd6-47bf-b95d-d93e41dd795f","Type":"ContainerStarted","Data":"b9293b46ebde9fe6b5433c28c9f5af4c672cce06bb4b1f87d8c8cfd9a3c2084d"} Nov 26 15:05:15 crc kubenswrapper[4651]: I1126 15:05:15.890328 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ce029284-0f6b-4827-9831-6c9b6b5cec58","Type":"ContainerStarted","Data":"00843027f13c5bae85f647d6ceeb2368510e27f454568bc7ce42bb687d52bef5"} Nov 26 15:05:15 crc kubenswrapper[4651]: I1126 15:05:15.891213 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4hsfq" event={"ID":"e1e2255f-1898-48ec-b534-24907368820d","Type":"ContainerStarted","Data":"5bd0c38cfc7fba310a8701be662f3caa7ae1602fe13047d831f58ac02adcb12f"} Nov 26 15:05:15 crc kubenswrapper[4651]: 
I1126 15:05:15.892846 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1d41f859-b430-492d-856b-e623f18f5df9","Type":"ContainerStarted","Data":"c821d2a0b211879e0b747fd952bb7fd881a4c466fe564c1b9de20d16fed88678"} Nov 26 15:05:15 crc kubenswrapper[4651]: E1126 15:05:15.894364 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" podUID="1b0fa55c-de9a-4cde-8602-2ac0086c6528" Nov 26 15:05:15 crc kubenswrapper[4651]: E1126 15:05:15.897793 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" podUID="554bf1dd-041e-4f64-bd95-548210d76b0c" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.333472 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.339642 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.510752 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prhvf\" (UniqueName: \"kubernetes.io/projected/a644fe47-8779-4a82-b1b1-1364500cd340-kube-api-access-prhvf\") pod \"a644fe47-8779-4a82-b1b1-1364500cd340\" (UID: \"a644fe47-8779-4a82-b1b1-1364500cd340\") " Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.510852 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-config\") pod \"79f5b89a-277a-4612-b47e-6edd56e43226\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.510877 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a644fe47-8779-4a82-b1b1-1364500cd340-config\") pod \"a644fe47-8779-4a82-b1b1-1364500cd340\" (UID: \"a644fe47-8779-4a82-b1b1-1364500cd340\") " Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.510906 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w76v\" (UniqueName: \"kubernetes.io/projected/79f5b89a-277a-4612-b47e-6edd56e43226-kube-api-access-9w76v\") pod \"79f5b89a-277a-4612-b47e-6edd56e43226\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.510961 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-dns-svc\") pod \"79f5b89a-277a-4612-b47e-6edd56e43226\" (UID: \"79f5b89a-277a-4612-b47e-6edd56e43226\") " Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.512180 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79f5b89a-277a-4612-b47e-6edd56e43226" (UID: "79f5b89a-277a-4612-b47e-6edd56e43226"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.512362 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a644fe47-8779-4a82-b1b1-1364500cd340-config" (OuterVolumeSpecName: "config") pod "a644fe47-8779-4a82-b1b1-1364500cd340" (UID: "a644fe47-8779-4a82-b1b1-1364500cd340"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.512955 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-config" (OuterVolumeSpecName: "config") pod "79f5b89a-277a-4612-b47e-6edd56e43226" (UID: "79f5b89a-277a-4612-b47e-6edd56e43226"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.528321 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a644fe47-8779-4a82-b1b1-1364500cd340-kube-api-access-prhvf" (OuterVolumeSpecName: "kube-api-access-prhvf") pod "a644fe47-8779-4a82-b1b1-1364500cd340" (UID: "a644fe47-8779-4a82-b1b1-1364500cd340"). InnerVolumeSpecName "kube-api-access-prhvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.528902 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f5b89a-277a-4612-b47e-6edd56e43226-kube-api-access-9w76v" (OuterVolumeSpecName: "kube-api-access-9w76v") pod "79f5b89a-277a-4612-b47e-6edd56e43226" (UID: "79f5b89a-277a-4612-b47e-6edd56e43226"). InnerVolumeSpecName "kube-api-access-9w76v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.612913 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.612975 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a644fe47-8779-4a82-b1b1-1364500cd340-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.612988 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w76v\" (UniqueName: \"kubernetes.io/projected/79f5b89a-277a-4612-b47e-6edd56e43226-kube-api-access-9w76v\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.613028 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79f5b89a-277a-4612-b47e-6edd56e43226-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.613056 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prhvf\" (UniqueName: \"kubernetes.io/projected/a644fe47-8779-4a82-b1b1-1364500cd340-kube-api-access-prhvf\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.674096 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-bp6gp"] Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.681967 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.684893 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bp6gp"] Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.685247 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.811582 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wqn6m"] Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.817185 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee616f8e-429e-4224-90d8-7757d8f52ebd-config\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.817231 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee616f8e-429e-4224-90d8-7757d8f52ebd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.817312 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9vxw\" (UniqueName: \"kubernetes.io/projected/ee616f8e-429e-4224-90d8-7757d8f52ebd-kube-api-access-t9vxw\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.817354 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/host-path/ee616f8e-429e-4224-90d8-7757d8f52ebd-ovn-rundir\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.817402 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ee616f8e-429e-4224-90d8-7757d8f52ebd-ovs-rundir\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.817468 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee616f8e-429e-4224-90d8-7757d8f52ebd-combined-ca-bundle\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.843757 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nmx8m"] Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.845387 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.854705 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.869854 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nmx8m"] Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.921566 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee616f8e-429e-4224-90d8-7757d8f52ebd-config\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.921634 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee616f8e-429e-4224-90d8-7757d8f52ebd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.921754 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9vxw\" (UniqueName: \"kubernetes.io/projected/ee616f8e-429e-4224-90d8-7757d8f52ebd-kube-api-access-t9vxw\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.921801 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ee616f8e-429e-4224-90d8-7757d8f52ebd-ovn-rundir\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: 
I1126 15:05:16.921861 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ee616f8e-429e-4224-90d8-7757d8f52ebd-ovs-rundir\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.921932 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee616f8e-429e-4224-90d8-7757d8f52ebd-combined-ca-bundle\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.924367 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee616f8e-429e-4224-90d8-7757d8f52ebd-config\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.925392 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ee616f8e-429e-4224-90d8-7757d8f52ebd-ovn-rundir\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.925457 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ee616f8e-429e-4224-90d8-7757d8f52ebd-ovs-rundir\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.935556 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee616f8e-429e-4224-90d8-7757d8f52ebd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.968481 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.968488 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tqr4l" event={"ID":"79f5b89a-277a-4612-b47e-6edd56e43226","Type":"ContainerDied","Data":"c2d4ae8e6f06d32c929f749076559856506684bc2a54b5378ba4bd433de2ebd5"} Nov 26 15:05:16 crc kubenswrapper[4651]: I1126 15:05:16.981388 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee616f8e-429e-4224-90d8-7757d8f52ebd-combined-ca-bundle\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.020319 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.021097 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bssfk" event={"ID":"a644fe47-8779-4a82-b1b1-1364500cd340","Type":"ContainerDied","Data":"146a42d2831e4f0ba5b2ca40174e878d7ee49e634145ee326f4f6e322fd10f1b"} Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.031806 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.031914 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdcpb\" (UniqueName: \"kubernetes.io/projected/372153a6-95cf-4708-9b85-0537b4556633-kube-api-access-vdcpb\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.031939 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-config\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.031965 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" 
Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.040822 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9vxw\" (UniqueName: \"kubernetes.io/projected/ee616f8e-429e-4224-90d8-7757d8f52ebd-kube-api-access-t9vxw\") pod \"ovn-controller-metrics-bp6gp\" (UID: \"ee616f8e-429e-4224-90d8-7757d8f52ebd\") " pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.135942 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdcpb\" (UniqueName: \"kubernetes.io/projected/372153a6-95cf-4708-9b85-0537b4556633-kube-api-access-vdcpb\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.136001 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-config\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.136080 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.136214 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.137195 
4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.137750 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-config\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.138131 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.176487 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdcpb\" (UniqueName: \"kubernetes.io/projected/372153a6-95cf-4708-9b85-0537b4556633-kube-api-access-vdcpb\") pod \"dnsmasq-dns-7fd796d7df-nmx8m\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") " pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.176873 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.211391 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tqr4l"] Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.219183 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tqr4l"] Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.294727 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bssfk"] Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.310750 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bp6gp" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.319075 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bssfk"] Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.328182 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7fbgb"] Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.440446 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f5b89a-277a-4612-b47e-6edd56e43226" path="/var/lib/kubelet/pods/79f5b89a-277a-4612-b47e-6edd56e43226/volumes" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.440870 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a644fe47-8779-4a82-b1b1-1364500cd340" path="/var/lib/kubelet/pods/a644fe47-8779-4a82-b1b1-1364500cd340/volumes" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.441266 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zchf4"] Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.444872 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.451276 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.497526 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zchf4"] Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.547770 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.547807 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.547937 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26sw\" (UniqueName: \"kubernetes.io/projected/e0877833-64b2-4651-afcd-17ec0d3b4a44-kube-api-access-s26sw\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.548068 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.548087 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-config\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.649663 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.649729 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-config\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.649803 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.649822 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc 
kubenswrapper[4651]: I1126 15:05:17.649883 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s26sw\" (UniqueName: \"kubernetes.io/projected/e0877833-64b2-4651-afcd-17ec0d3b4a44-kube-api-access-s26sw\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.650869 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.652163 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.652614 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-config\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.652676 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.670800 4651 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-s26sw\" (UniqueName: \"kubernetes.io/projected/e0877833-64b2-4651-afcd-17ec0d3b4a44-kube-api-access-s26sw\") pod \"dnsmasq-dns-86db49b7ff-zchf4\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.788336 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.791744 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.929564 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.955965 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-config\") pod \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.956481 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-config" (OuterVolumeSpecName: "config") pod "1b0fa55c-de9a-4cde-8602-2ac0086c6528" (UID: "1b0fa55c-de9a-4cde-8602-2ac0086c6528"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.956502 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-dns-svc\") pod \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.956673 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcv9v\" (UniqueName: \"kubernetes.io/projected/1b0fa55c-de9a-4cde-8602-2ac0086c6528-kube-api-access-bcv9v\") pod \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\" (UID: \"1b0fa55c-de9a-4cde-8602-2ac0086c6528\") " Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.957174 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.957447 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b0fa55c-de9a-4cde-8602-2ac0086c6528" (UID: "1b0fa55c-de9a-4cde-8602-2ac0086c6528"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:17 crc kubenswrapper[4651]: I1126 15:05:17.961068 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0fa55c-de9a-4cde-8602-2ac0086c6528-kube-api-access-bcv9v" (OuterVolumeSpecName: "kube-api-access-bcv9v") pod "1b0fa55c-de9a-4cde-8602-2ac0086c6528" (UID: "1b0fa55c-de9a-4cde-8602-2ac0086c6528"). InnerVolumeSpecName "kube-api-access-bcv9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.031639 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" event={"ID":"554bf1dd-041e-4f64-bd95-548210d76b0c","Type":"ContainerDied","Data":"7a1c4cdbc4699788cd0db636aa486d2730e3eae9b2f24f5c8d3ee293d080c29b"} Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.031729 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7fbgb" Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.039980 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" event={"ID":"1b0fa55c-de9a-4cde-8602-2ac0086c6528","Type":"ContainerDied","Data":"548dfb589387e98679693393cb9fc07ba4ef2c7ae14fdc95e246c0c38357bb1d"} Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.040088 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wqn6m" Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.058079 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-config\") pod \"554bf1dd-041e-4f64-bd95-548210d76b0c\" (UID: \"554bf1dd-041e-4f64-bd95-548210d76b0c\") " Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.058122 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-dns-svc\") pod \"554bf1dd-041e-4f64-bd95-548210d76b0c\" (UID: \"554bf1dd-041e-4f64-bd95-548210d76b0c\") " Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.058938 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67vcb\" (UniqueName: 
\"kubernetes.io/projected/554bf1dd-041e-4f64-bd95-548210d76b0c-kube-api-access-67vcb\") pod \"554bf1dd-041e-4f64-bd95-548210d76b0c\" (UID: \"554bf1dd-041e-4f64-bd95-548210d76b0c\") " Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.059539 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b0fa55c-de9a-4cde-8602-2ac0086c6528-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.059559 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcv9v\" (UniqueName: \"kubernetes.io/projected/1b0fa55c-de9a-4cde-8602-2ac0086c6528-kube-api-access-bcv9v\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.060612 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-config" (OuterVolumeSpecName: "config") pod "554bf1dd-041e-4f64-bd95-548210d76b0c" (UID: "554bf1dd-041e-4f64-bd95-548210d76b0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.061191 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "554bf1dd-041e-4f64-bd95-548210d76b0c" (UID: "554bf1dd-041e-4f64-bd95-548210d76b0c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.062813 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/554bf1dd-041e-4f64-bd95-548210d76b0c-kube-api-access-67vcb" (OuterVolumeSpecName: "kube-api-access-67vcb") pod "554bf1dd-041e-4f64-bd95-548210d76b0c" (UID: "554bf1dd-041e-4f64-bd95-548210d76b0c"). InnerVolumeSpecName "kube-api-access-67vcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:18 crc kubenswrapper[4651]: W1126 15:05:18.100283 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod372153a6_95cf_4708_9b85_0537b4556633.slice/crio-0c12043f30045b8121f7c65e847d8d92c6c7924d3e670460a7581c0599376d6d WatchSource:0}: Error finding container 0c12043f30045b8121f7c65e847d8d92c6c7924d3e670460a7581c0599376d6d: Status 404 returned error can't find the container with id 0c12043f30045b8121f7c65e847d8d92c6c7924d3e670460a7581c0599376d6d Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.104540 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nmx8m"] Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.115855 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bp6gp"] Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.161075 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67vcb\" (UniqueName: \"kubernetes.io/projected/554bf1dd-041e-4f64-bd95-548210d76b0c-kube-api-access-67vcb\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.161116 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.161127 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/554bf1dd-041e-4f64-bd95-548210d76b0c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.166091 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wqn6m"] Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.172213 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-wqn6m"] Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.388476 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7fbgb"] Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.393646 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7fbgb"] Nov 26 15:05:18 crc kubenswrapper[4651]: I1126 15:05:18.437346 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zchf4"] Nov 26 15:05:19 crc kubenswrapper[4651]: I1126 15:05:19.048453 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" event={"ID":"372153a6-95cf-4708-9b85-0537b4556633","Type":"ContainerStarted","Data":"0c12043f30045b8121f7c65e847d8d92c6c7924d3e670460a7581c0599376d6d"} Nov 26 15:05:19 crc kubenswrapper[4651]: I1126 15:05:19.049780 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" event={"ID":"e0877833-64b2-4651-afcd-17ec0d3b4a44","Type":"ContainerStarted","Data":"b3c1637d930c47f0be6cfaf110d52a32753705689b984c0a5b843ead4afce696"} Nov 26 15:05:19 crc kubenswrapper[4651]: I1126 15:05:19.051062 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bp6gp" event={"ID":"ee616f8e-429e-4224-90d8-7757d8f52ebd","Type":"ContainerStarted","Data":"c258cc1bfbfcd1520825877d82ba03c191e35eb0b6e02fd870fedb912288b2e9"} Nov 26 15:05:19 crc kubenswrapper[4651]: I1126 15:05:19.411507 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b0fa55c-de9a-4cde-8602-2ac0086c6528" path="/var/lib/kubelet/pods/1b0fa55c-de9a-4cde-8602-2ac0086c6528/volumes" Nov 26 15:05:19 crc kubenswrapper[4651]: I1126 15:05:19.411954 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="554bf1dd-041e-4f64-bd95-548210d76b0c" path="/var/lib/kubelet/pods/554bf1dd-041e-4f64-bd95-548210d76b0c/volumes" Nov 26 
15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.076243 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fcd2f469-d922-4c5d-a885-517ad214a748","Type":"ContainerStarted","Data":"04343b60f1af71c8a270fbee25425e6bcf82471287701780fa3930476737ec6c"}
Nov 26 15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.083849 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zrhdf" event={"ID":"13f26ce1-fcd6-47bf-b95d-d93e41dd795f","Type":"ContainerStarted","Data":"ae0af85bb41281a2b1606ea3cc27942ba2e94b588ec7af0d1f7dfd56f1493c00"}
Nov 26 15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.085183 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-zrhdf"
Nov 26 15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.087530 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ce029284-0f6b-4827-9831-6c9b6b5cec58","Type":"ContainerStarted","Data":"ab107b68264bdbce827ec76c1fb1b123a0923d70d7e752c42c9e33f3b58b17d1"}
Nov 26 15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.094565 4651 generic.go:334] "Generic (PLEG): container finished" podID="372153a6-95cf-4708-9b85-0537b4556633" containerID="9aff03679f0a5c47e3cb5eec96106d3087130dd4344e4719987c106e9f58403b" exitCode=0
Nov 26 15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.094827 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" event={"ID":"372153a6-95cf-4708-9b85-0537b4556633","Type":"ContainerDied","Data":"9aff03679f0a5c47e3cb5eec96106d3087130dd4344e4719987c106e9f58403b"}
Nov 26 15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.104281 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5e9fabd8-6c8f-4ff7-960c-29c3105073b5","Type":"ContainerStarted","Data":"5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559"}
Nov 26 15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.104421 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Nov 26 15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.107810 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4hsfq" event={"ID":"e1e2255f-1898-48ec-b534-24907368820d","Type":"ContainerStarted","Data":"846fa2534dd7181fce391c05fb044aaf2bc4ca395f66980b32b7ddc79a2a7a2f"}
Nov 26 15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.112655 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1d41f859-b430-492d-856b-e623f18f5df9","Type":"ContainerStarted","Data":"df34a31dc1c2c31f442a4a8705d4172a8013d0af1ac5a5f4817cda6dda40102c"}
Nov 26 15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.138917 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zrhdf" podStartSLOduration=23.372219811 podStartE2EDuration="29.138899613s" podCreationTimestamp="2025-11-26 15:04:53 +0000 UTC" firstStartedPulling="2025-11-26 15:05:15.757060661 +0000 UTC m=+883.182808265" lastFinishedPulling="2025-11-26 15:05:21.523740463 +0000 UTC m=+888.949488067" observedRunningTime="2025-11-26 15:05:22.132238171 +0000 UTC m=+889.557985775" watchObservedRunningTime="2025-11-26 15:05:22.138899613 +0000 UTC m=+889.564647217"
Nov 26 15:05:22 crc kubenswrapper[4651]: I1126 15:05:22.178479 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.527107573 podStartE2EDuration="34.1784612s" podCreationTimestamp="2025-11-26 15:04:48 +0000 UTC" firstStartedPulling="2025-11-26 15:04:56.784556395 +0000 UTC m=+864.210303999" lastFinishedPulling="2025-11-26 15:05:21.435910022 +0000 UTC m=+888.861657626" observedRunningTime="2025-11-26 15:05:22.159879724 +0000 UTC m=+889.585627328" watchObservedRunningTime="2025-11-26 15:05:22.1784612 +0000 UTC m=+889.604208804"
Nov 26 15:05:23 crc kubenswrapper[4651]: I1126 15:05:23.123133 4651 generic.go:334] "Generic (PLEG): container finished" podID="e1e2255f-1898-48ec-b534-24907368820d" containerID="846fa2534dd7181fce391c05fb044aaf2bc4ca395f66980b32b7ddc79a2a7a2f" exitCode=0
Nov 26 15:05:23 crc kubenswrapper[4651]: I1126 15:05:23.123267 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4hsfq" event={"ID":"e1e2255f-1898-48ec-b534-24907368820d","Type":"ContainerDied","Data":"846fa2534dd7181fce391c05fb044aaf2bc4ca395f66980b32b7ddc79a2a7a2f"}
Nov 26 15:05:24 crc kubenswrapper[4651]: I1126 15:05:24.157173 4651 generic.go:334] "Generic (PLEG): container finished" podID="e0877833-64b2-4651-afcd-17ec0d3b4a44" containerID="aa560f51ef7cab75d6a555a37faba3880576d1e9d7484e99b85aee8292098dc9" exitCode=0
Nov 26 15:05:24 crc kubenswrapper[4651]: I1126 15:05:24.157721 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" event={"ID":"e0877833-64b2-4651-afcd-17ec0d3b4a44","Type":"ContainerDied","Data":"aa560f51ef7cab75d6a555a37faba3880576d1e9d7484e99b85aee8292098dc9"}
Nov 26 15:05:24 crc kubenswrapper[4651]: I1126 15:05:24.164164 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4hsfq" event={"ID":"e1e2255f-1898-48ec-b534-24907368820d","Type":"ContainerStarted","Data":"0e187412d43e51e88f8671796e41f0e7ae96e6ad530e740762b7c892d19b7010"}
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.174866 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1d41f859-b430-492d-856b-e623f18f5df9","Type":"ContainerStarted","Data":"4931963ae45a8d6d4450086fc07c3d10eb5567c06695a49bcde0f831042c231c"}
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.179223 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bp6gp" event={"ID":"ee616f8e-429e-4224-90d8-7757d8f52ebd","Type":"ContainerStarted","Data":"22eabaec1d52d2dc9c84121de3f7dbf9e07aab0175e20631c8d0fdbcfaf432d3"}
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.185870 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fded8231-49dd-41d6-8e30-85572ad226db","Type":"ContainerStarted","Data":"d78fef925a3c6bbc2fe98b6d9958f19513b482c7746d86cd51ef00bc61f61ef6"}
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.188701 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ce029284-0f6b-4827-9831-6c9b6b5cec58","Type":"ContainerStarted","Data":"ebcc850defa39d5e0eaf9b312a953d08a085654470db703399b49d237eacf488"}
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.193327 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" event={"ID":"372153a6-95cf-4708-9b85-0537b4556633","Type":"ContainerStarted","Data":"ed4ef1b6b971116d84874706cdba03b4649f1770825ed0f2a16914147c73a128"}
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.193488 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m"
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.196962 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" event={"ID":"e0877833-64b2-4651-afcd-17ec0d3b4a44","Type":"ContainerStarted","Data":"877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa"}
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.197015 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4"
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.200885 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4hsfq" event={"ID":"e1e2255f-1898-48ec-b534-24907368820d","Type":"ContainerStarted","Data":"a48855ae60de67aae87bac233c79fd9f8c61dca75482e40614008e5d53381c73"}
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.201292 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=22.139269656 podStartE2EDuration="30.201270655s" podCreationTimestamp="2025-11-26 15:04:55 +0000 UTC" firstStartedPulling="2025-11-26 15:05:15.817370763 +0000 UTC m=+883.243118367" lastFinishedPulling="2025-11-26 15:05:23.879371762 +0000 UTC m=+891.305119366" observedRunningTime="2025-11-26 15:05:25.195432437 +0000 UTC m=+892.621180051" watchObservedRunningTime="2025-11-26 15:05:25.201270655 +0000 UTC m=+892.627018259"
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.201640 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.201667 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4hsfq"
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.228158 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=24.383072476 podStartE2EDuration="33.228140587s" podCreationTimestamp="2025-11-26 15:04:52 +0000 UTC" firstStartedPulling="2025-11-26 15:05:15.128590889 +0000 UTC m=+882.554338493" lastFinishedPulling="2025-11-26 15:05:23.973659 +0000 UTC m=+891.399406604" observedRunningTime="2025-11-26 15:05:25.225871295 +0000 UTC m=+892.651618919" watchObservedRunningTime="2025-11-26 15:05:25.228140587 +0000 UTC m=+892.653888191"
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.247607 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" podStartSLOduration=5.837827336 podStartE2EDuration="9.247583776s" podCreationTimestamp="2025-11-26 15:05:16 +0000 UTC" firstStartedPulling="2025-11-26 15:05:18.103382635 +0000 UTC m=+885.529130239" lastFinishedPulling="2025-11-26 15:05:21.513139075 +0000 UTC m=+888.938886679" observedRunningTime="2025-11-26 15:05:25.244085401 +0000 UTC m=+892.669833015" watchObservedRunningTime="2025-11-26 15:05:25.247583776 +0000 UTC m=+892.673331380"
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.280752 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-bp6gp" podStartSLOduration=3.527319067 podStartE2EDuration="9.280736339s" podCreationTimestamp="2025-11-26 15:05:16 +0000 UTC" firstStartedPulling="2025-11-26 15:05:18.10793347 +0000 UTC m=+885.533681074" lastFinishedPulling="2025-11-26 15:05:23.861350742 +0000 UTC m=+891.287098346" observedRunningTime="2025-11-26 15:05:25.28004836 +0000 UTC m=+892.705795964" watchObservedRunningTime="2025-11-26 15:05:25.280736339 +0000 UTC m=+892.706483943"
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.342763 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" podStartSLOduration=5.270868677 podStartE2EDuration="8.342740117s" podCreationTimestamp="2025-11-26 15:05:17 +0000 UTC" firstStartedPulling="2025-11-26 15:05:18.455558994 +0000 UTC m=+885.881306598" lastFinishedPulling="2025-11-26 15:05:21.527430434 +0000 UTC m=+888.953178038" observedRunningTime="2025-11-26 15:05:25.326461544 +0000 UTC m=+892.752209158" watchObservedRunningTime="2025-11-26 15:05:25.342740117 +0000 UTC m=+892.768487721"
Nov 26 15:05:25 crc kubenswrapper[4651]: I1126 15:05:25.365080 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4hsfq" podStartSLOduration=26.812472382 podStartE2EDuration="32.365060135s" podCreationTimestamp="2025-11-26 15:04:53 +0000 UTC" firstStartedPulling="2025-11-26 15:05:15.87233363 +0000 UTC m=+883.298081234" lastFinishedPulling="2025-11-26 15:05:21.424921373 +0000 UTC m=+888.850668987" observedRunningTime="2025-11-26 15:05:25.359736489 +0000 UTC m=+892.785484103" watchObservedRunningTime="2025-11-26 15:05:25.365060135 +0000 UTC m=+892.790807749"
Nov 26 15:05:26 crc kubenswrapper[4651]: I1126 15:05:26.376508 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Nov 26 15:05:26 crc kubenswrapper[4651]: I1126 15:05:26.376781 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Nov 26 15:05:26 crc kubenswrapper[4651]: I1126 15:05:26.468373 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Nov 26 15:05:26 crc kubenswrapper[4651]: I1126 15:05:26.937982 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Nov 26 15:05:26 crc kubenswrapper[4651]: I1126 15:05:26.979439 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.215169 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.256217 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.257975 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.537951 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.540403 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.546585 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.546851 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.547000 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.547002 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-8555n"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.634749 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b709538-7306-4705-aafe-d0d20e151610-config\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.634829 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqsk8\" (UniqueName: \"kubernetes.io/projected/3b709538-7306-4705-aafe-d0d20e151610-kube-api-access-kqsk8\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.634923 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b709538-7306-4705-aafe-d0d20e151610-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.634941 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b709538-7306-4705-aafe-d0d20e151610-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.634998 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b709538-7306-4705-aafe-d0d20e151610-scripts\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.635022 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b709538-7306-4705-aafe-d0d20e151610-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.635350 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.639110 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b709538-7306-4705-aafe-d0d20e151610-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.741123 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b709538-7306-4705-aafe-d0d20e151610-scripts\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.741175 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b709538-7306-4705-aafe-d0d20e151610-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.741223 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b709538-7306-4705-aafe-d0d20e151610-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.741257 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b709538-7306-4705-aafe-d0d20e151610-config\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.741272 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqsk8\" (UniqueName: \"kubernetes.io/projected/3b709538-7306-4705-aafe-d0d20e151610-kube-api-access-kqsk8\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.741317 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b709538-7306-4705-aafe-d0d20e151610-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.741332 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b709538-7306-4705-aafe-d0d20e151610-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.742135 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b709538-7306-4705-aafe-d0d20e151610-scripts\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.742524 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b709538-7306-4705-aafe-d0d20e151610-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.743086 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b709538-7306-4705-aafe-d0d20e151610-config\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.746621 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b709538-7306-4705-aafe-d0d20e151610-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.746853 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b709538-7306-4705-aafe-d0d20e151610-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.748408 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b709538-7306-4705-aafe-d0d20e151610-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.764710 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqsk8\" (UniqueName: \"kubernetes.io/projected/3b709538-7306-4705-aafe-d0d20e151610-kube-api-access-kqsk8\") pod \"ovn-northd-0\" (UID: \"3b709538-7306-4705-aafe-d0d20e151610\") " pod="openstack/ovn-northd-0"
Nov 26 15:05:27 crc kubenswrapper[4651]: I1126 15:05:27.859753 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Nov 26 15:05:28 crc kubenswrapper[4651]: I1126 15:05:28.305322 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Nov 26 15:05:29 crc kubenswrapper[4651]: I1126 15:05:29.132622 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 15:05:29 crc kubenswrapper[4651]: I1126 15:05:29.132706 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 15:05:29 crc kubenswrapper[4651]: I1126 15:05:29.219842 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Nov 26 15:05:29 crc kubenswrapper[4651]: I1126 15:05:29.229725 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3b709538-7306-4705-aafe-d0d20e151610","Type":"ContainerStarted","Data":"f26924464027f151658c87b6a94d8e35b4e3803a2246b6396274d0f49b4b6826"}
Nov 26 15:05:29 crc kubenswrapper[4651]: I1126 15:05:29.231453 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4fc026e6-8f32-45d0-bab4-c12dd93d946f","Type":"ContainerStarted","Data":"f714b1dc5c1abbf0d0a21f31b86e43500b5361442439a451dbe49beb89edb9ef"}
Nov 26 15:05:30 crc kubenswrapper[4651]: I1126 15:05:30.241409 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8f351a70-5e04-4270-b9bb-00586a94da1f","Type":"ContainerStarted","Data":"a0186b2a528014a344cc52586bde2af9796297e5acadc7664ee8093e61ffa401"}
Nov 26 15:05:30 crc kubenswrapper[4651]: I1126 15:05:30.248750 4651 generic.go:334] "Generic (PLEG): container finished" podID="fcd2f469-d922-4c5d-a885-517ad214a748" containerID="04343b60f1af71c8a270fbee25425e6bcf82471287701780fa3930476737ec6c" exitCode=0
Nov 26 15:05:30 crc kubenswrapper[4651]: I1126 15:05:30.248947 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fcd2f469-d922-4c5d-a885-517ad214a748","Type":"ContainerDied","Data":"04343b60f1af71c8a270fbee25425e6bcf82471287701780fa3930476737ec6c"}
Nov 26 15:05:30 crc kubenswrapper[4651]: I1126 15:05:30.251720 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3b709538-7306-4705-aafe-d0d20e151610","Type":"ContainerStarted","Data":"26e288777ac553d183afafff0cb0f5f0c1df52524e170ee849471788a85f0325"}
Nov 26 15:05:30 crc kubenswrapper[4651]: I1126 15:05:30.257757 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"31f56fef-ac96-4560-b99c-71b77bcecd4b","Type":"ContainerStarted","Data":"66b8c7a95532d6dd61b438e9c34992da28829a6bf19d561ac7beff2bf500af03"}
Nov 26 15:05:30 crc kubenswrapper[4651]: I1126 15:05:30.257974 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Nov 26 15:05:31 crc kubenswrapper[4651]: I1126 15:05:31.268503 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fcd2f469-d922-4c5d-a885-517ad214a748","Type":"ContainerStarted","Data":"e32912c50d62356d617a700e98c53c2cb944b8f55a5e79385262bef9b1b09676"}
Nov 26 15:05:31 crc kubenswrapper[4651]: I1126 15:05:31.271283 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3b709538-7306-4705-aafe-d0d20e151610","Type":"ContainerStarted","Data":"90b7734403e59b2ec4146e8fe0d1e8376853c86fa8d4dbe4cb5d7e30b31b6358"}
Nov 26 15:05:31 crc kubenswrapper[4651]: I1126 15:05:31.271408 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Nov 26 15:05:31 crc kubenswrapper[4651]: I1126 15:05:31.297711 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=12.187685865 podStartE2EDuration="47.297689746s" podCreationTimestamp="2025-11-26 15:04:44 +0000 UTC" firstStartedPulling="2025-11-26 15:04:46.41363636 +0000 UTC m=+853.839383964" lastFinishedPulling="2025-11-26 15:05:21.523640241 +0000 UTC m=+888.949387845" observedRunningTime="2025-11-26 15:05:31.288763933 +0000 UTC m=+898.714511597" watchObservedRunningTime="2025-11-26 15:05:31.297689746 +0000 UTC m=+898.723437360"
Nov 26 15:05:31 crc kubenswrapper[4651]: I1126 15:05:31.300212 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.265432877 podStartE2EDuration="45.300183444s" podCreationTimestamp="2025-11-26 15:04:46 +0000 UTC" firstStartedPulling="2025-11-26 15:04:47.9484907 +0000 UTC m=+855.374238294" lastFinishedPulling="2025-11-26 15:05:29.983241257 +0000 UTC m=+897.408988861" observedRunningTime="2025-11-26 15:05:30.333334289 +0000 UTC m=+897.759081903" watchObservedRunningTime="2025-11-26 15:05:31.300183444 +0000 UTC m=+898.725931118"
Nov 26 15:05:31 crc kubenswrapper[4651]: I1126 15:05:31.322617 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.6610435040000002 podStartE2EDuration="4.322593574s" podCreationTimestamp="2025-11-26 15:05:27 +0000 UTC" firstStartedPulling="2025-11-26 15:05:28.320907826 +0000 UTC m=+895.746655430" lastFinishedPulling="2025-11-26 15:05:29.982457896 +0000 UTC m=+897.408205500" observedRunningTime="2025-11-26 15:05:31.319213973 +0000 UTC m=+898.744961577" watchObservedRunningTime="2025-11-26 15:05:31.322593574 +0000 UTC m=+898.748341208"
Nov 26 15:05:32 crc kubenswrapper[4651]: I1126 15:05:32.179323 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m"
Nov 26 15:05:32 crc kubenswrapper[4651]: I1126 15:05:32.283436 4651 generic.go:334] "Generic (PLEG): container finished" podID="fded8231-49dd-41d6-8e30-85572ad226db" containerID="d78fef925a3c6bbc2fe98b6d9958f19513b482c7746d86cd51ef00bc61f61ef6" exitCode=0
Nov 26 15:05:32 crc kubenswrapper[4651]: I1126 15:05:32.283513 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fded8231-49dd-41d6-8e30-85572ad226db","Type":"ContainerDied","Data":"d78fef925a3c6bbc2fe98b6d9958f19513b482c7746d86cd51ef00bc61f61ef6"}
Nov 26 15:05:32 crc kubenswrapper[4651]: I1126 15:05:32.789753 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4"
Nov 26 15:05:32 crc kubenswrapper[4651]: I1126 15:05:32.849072 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nmx8m"]
Nov 26 15:05:32 crc kubenswrapper[4651]: I1126 15:05:32.849293 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" podUID="372153a6-95cf-4708-9b85-0537b4556633" containerName="dnsmasq-dns" containerID="cri-o://ed4ef1b6b971116d84874706cdba03b4649f1770825ed0f2a16914147c73a128" gracePeriod=10
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.293168 4651 generic.go:334] "Generic (PLEG): container finished" podID="372153a6-95cf-4708-9b85-0537b4556633" containerID="ed4ef1b6b971116d84874706cdba03b4649f1770825ed0f2a16914147c73a128" exitCode=0
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.293566 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" event={"ID":"372153a6-95cf-4708-9b85-0537b4556633","Type":"ContainerDied","Data":"ed4ef1b6b971116d84874706cdba03b4649f1770825ed0f2a16914147c73a128"}
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.300343 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fded8231-49dd-41d6-8e30-85572ad226db","Type":"ContainerStarted","Data":"cb36706fc35723ce25cda8afedc2cb3c0a811b6c8e6452d6b75ff78649f9c172"}
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.331557 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=12.24454382 podStartE2EDuration="48.331537143s" podCreationTimestamp="2025-11-26 15:04:45 +0000 UTC" firstStartedPulling="2025-11-26 15:04:47.841524818 +0000 UTC m=+855.267272442" lastFinishedPulling="2025-11-26 15:05:23.928518161 +0000 UTC m=+891.354265765" observedRunningTime="2025-11-26 15:05:33.325785176 +0000 UTC m=+900.751532780" watchObservedRunningTime="2025-11-26 15:05:33.331537143 +0000 UTC m=+900.757284747"
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.505998 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m"
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.645899 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-dns-svc\") pod \"372153a6-95cf-4708-9b85-0537b4556633\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") "
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.645971 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-ovsdbserver-nb\") pod \"372153a6-95cf-4708-9b85-0537b4556633\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") "
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.645998 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-config\") pod \"372153a6-95cf-4708-9b85-0537b4556633\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") "
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.646051 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdcpb\" (UniqueName: \"kubernetes.io/projected/372153a6-95cf-4708-9b85-0537b4556633-kube-api-access-vdcpb\") pod \"372153a6-95cf-4708-9b85-0537b4556633\" (UID: \"372153a6-95cf-4708-9b85-0537b4556633\") "
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.652554 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/372153a6-95cf-4708-9b85-0537b4556633-kube-api-access-vdcpb" (OuterVolumeSpecName: "kube-api-access-vdcpb") pod "372153a6-95cf-4708-9b85-0537b4556633" (UID: "372153a6-95cf-4708-9b85-0537b4556633"). InnerVolumeSpecName "kube-api-access-vdcpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.706524 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "372153a6-95cf-4708-9b85-0537b4556633" (UID: "372153a6-95cf-4708-9b85-0537b4556633"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.716963 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "372153a6-95cf-4708-9b85-0537b4556633" (UID: "372153a6-95cf-4708-9b85-0537b4556633"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.717715 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-config" (OuterVolumeSpecName: "config") pod "372153a6-95cf-4708-9b85-0537b4556633" (UID: "372153a6-95cf-4708-9b85-0537b4556633"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.748444 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.748608 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.748705 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/372153a6-95cf-4708-9b85-0537b4556633-config\") on node \"crc\" DevicePath \"\""
Nov 26 15:05:33 crc kubenswrapper[4651]: I1126 15:05:33.748780 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdcpb\" (UniqueName: \"kubernetes.io/projected/372153a6-95cf-4708-9b85-0537b4556633-kube-api-access-vdcpb\") on node \"crc\" DevicePath \"\""
Nov 26 15:05:34 crc kubenswrapper[4651]: I1126 15:05:34.310636 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m" event={"ID":"372153a6-95cf-4708-9b85-0537b4556633","Type":"ContainerDied","Data":"0c12043f30045b8121f7c65e847d8d92c6c7924d3e670460a7581c0599376d6d"}
Nov 26 15:05:34 crc kubenswrapper[4651]: I1126 15:05:34.310680 4651 scope.go:117] "RemoveContainer" containerID="ed4ef1b6b971116d84874706cdba03b4649f1770825ed0f2a16914147c73a128"
Nov 26 15:05:34 crc kubenswrapper[4651]: I1126 15:05:34.310702 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-nmx8m"
Nov 26 15:05:34 crc kubenswrapper[4651]: I1126 15:05:34.348063 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nmx8m"]
Nov 26 15:05:34 crc kubenswrapper[4651]: I1126 15:05:34.352358 4651 scope.go:117] "RemoveContainer" containerID="9aff03679f0a5c47e3cb5eec96106d3087130dd4344e4719987c106e9f58403b"
Nov 26 15:05:34 crc kubenswrapper[4651]: I1126 15:05:34.354857 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-nmx8m"]
Nov 26 15:05:35 crc kubenswrapper[4651]: I1126 15:05:35.416916 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="372153a6-95cf-4708-9b85-0537b4556633" path="/var/lib/kubelet/pods/372153a6-95cf-4708-9b85-0537b4556633/volumes"
Nov 26 15:05:35 crc kubenswrapper[4651]: I1126 15:05:35.715733 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Nov 26 15:05:35 crc kubenswrapper[4651]: I1126 15:05:35.715806 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Nov 26 15:05:37 crc kubenswrapper[4651]: I1126 15:05:37.069237 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Nov 26 15:05:37 crc kubenswrapper[4651]: I1126 15:05:37.069490 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Nov 26 15:05:37 crc kubenswrapper[4651]: I1126 15:05:37.213683 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Nov 26 15:05:37 crc kubenswrapper[4651]: I1126 15:05:37.283312 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Nov 26 15:05:37 crc kubenswrapper[4651]: I1126 15:05:37.412148 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness"
status="ready" pod="openstack/openstack-cell1-galera-0" Nov 26 15:05:37 crc kubenswrapper[4651]: I1126 15:05:37.940264 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 26 15:05:38 crc kubenswrapper[4651]: I1126 15:05:38.020011 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.287756 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-4ggsw"] Nov 26 15:05:39 crc kubenswrapper[4651]: E1126 15:05:39.288349 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372153a6-95cf-4708-9b85-0537b4556633" containerName="dnsmasq-dns" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.288361 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="372153a6-95cf-4708-9b85-0537b4556633" containerName="dnsmasq-dns" Nov 26 15:05:39 crc kubenswrapper[4651]: E1126 15:05:39.288398 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372153a6-95cf-4708-9b85-0537b4556633" containerName="init" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.288406 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="372153a6-95cf-4708-9b85-0537b4556633" containerName="init" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.288535 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="372153a6-95cf-4708-9b85-0537b4556633" containerName="dnsmasq-dns" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.289359 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.304523 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4ggsw"] Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.371286 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-dns-svc\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.371349 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qfwb\" (UniqueName: \"kubernetes.io/projected/10d9303a-3724-4f7c-90b2-ef9ba8b92200-kube-api-access-8qfwb\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.371368 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.371388 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.371427 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-config\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.473678 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-dns-svc\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.473765 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qfwb\" (UniqueName: \"kubernetes.io/projected/10d9303a-3724-4f7c-90b2-ef9ba8b92200-kube-api-access-8qfwb\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.473783 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.473810 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.473862 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-config\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.474639 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-config\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.475330 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.475499 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.475618 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-dns-svc\") pod \"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.497977 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qfwb\" (UniqueName: \"kubernetes.io/projected/10d9303a-3724-4f7c-90b2-ef9ba8b92200-kube-api-access-8qfwb\") pod 
\"dnsmasq-dns-698758b865-4ggsw\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:39 crc kubenswrapper[4651]: I1126 15:05:39.614826 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.140439 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4ggsw"] Nov 26 15:05:40 crc kubenswrapper[4651]: W1126 15:05:40.149184 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10d9303a_3724_4f7c_90b2_ef9ba8b92200.slice/crio-7a9b122af4309b44dc0fa3136e89615c8c9291be490a1ef9d7591b3d2b29dfd5 WatchSource:0}: Error finding container 7a9b122af4309b44dc0fa3136e89615c8c9291be490a1ef9d7591b3d2b29dfd5: Status 404 returned error can't find the container with id 7a9b122af4309b44dc0fa3136e89615c8c9291be490a1ef9d7591b3d2b29dfd5 Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.357119 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4ggsw" event={"ID":"10d9303a-3724-4f7c-90b2-ef9ba8b92200","Type":"ContainerStarted","Data":"cf2111f8328d8ff60aad4296092443239b09e224079a5b1f3cd30a51431b1f50"} Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.357180 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4ggsw" event={"ID":"10d9303a-3724-4f7c-90b2-ef9ba8b92200","Type":"ContainerStarted","Data":"7a9b122af4309b44dc0fa3136e89615c8c9291be490a1ef9d7591b3d2b29dfd5"} Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.404260 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.412681 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.415941 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.416166 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.416242 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.416331 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-d4kwd" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.429952 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.496609 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.496881 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zld4f\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-kube-api-access-zld4f\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.496999 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-cache\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " 
pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.497181 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-lock\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.497276 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.598344 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-lock\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.598698 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.598848 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.598981 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zld4f\" (UniqueName: 
\"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-kube-api-access-zld4f\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.599117 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-cache\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.598920 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.599054 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-lock\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: E1126 15:05:40.599023 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.599428 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-cache\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: E1126 15:05:40.599442 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:05:40 crc 
kubenswrapper[4651]: E1126 15:05:40.599487 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift podName:a3b8c2db-ce7f-48ce-9fd1-d55b5583773e nodeName:}" failed. No retries permitted until 2025-11-26 15:05:41.099470482 +0000 UTC m=+908.525218086 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift") pod "swift-storage-0" (UID: "a3b8c2db-ce7f-48ce-9fd1-d55b5583773e") : configmap "swift-ring-files" not found Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.624266 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:40 crc kubenswrapper[4651]: I1126 15:05:40.639454 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zld4f\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-kube-api-access-zld4f\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:41 crc kubenswrapper[4651]: I1126 15:05:41.106748 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:41 crc kubenswrapper[4651]: E1126 15:05:41.107094 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:05:41 crc kubenswrapper[4651]: E1126 15:05:41.107117 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:05:41 crc kubenswrapper[4651]: E1126 15:05:41.107168 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift podName:a3b8c2db-ce7f-48ce-9fd1-d55b5583773e nodeName:}" failed. No retries permitted until 2025-11-26 15:05:42.107154555 +0000 UTC m=+909.532902149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift") pod "swift-storage-0" (UID: "a3b8c2db-ce7f-48ce-9fd1-d55b5583773e") : configmap "swift-ring-files" not found Nov 26 15:05:41 crc kubenswrapper[4651]: I1126 15:05:41.363628 4651 generic.go:334] "Generic (PLEG): container finished" podID="10d9303a-3724-4f7c-90b2-ef9ba8b92200" containerID="cf2111f8328d8ff60aad4296092443239b09e224079a5b1f3cd30a51431b1f50" exitCode=0 Nov 26 15:05:41 crc kubenswrapper[4651]: I1126 15:05:41.363904 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4ggsw" event={"ID":"10d9303a-3724-4f7c-90b2-ef9ba8b92200","Type":"ContainerDied","Data":"cf2111f8328d8ff60aad4296092443239b09e224079a5b1f3cd30a51431b1f50"} Nov 26 15:05:42 crc kubenswrapper[4651]: I1126 15:05:42.126928 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:42 crc kubenswrapper[4651]: E1126 15:05:42.127143 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:05:42 crc kubenswrapper[4651]: E1126 15:05:42.127308 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found 
Nov 26 15:05:42 crc kubenswrapper[4651]: E1126 15:05:42.127354 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift podName:a3b8c2db-ce7f-48ce-9fd1-d55b5583773e nodeName:}" failed. No retries permitted until 2025-11-26 15:05:44.127339382 +0000 UTC m=+911.553086986 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift") pod "swift-storage-0" (UID: "a3b8c2db-ce7f-48ce-9fd1-d55b5583773e") : configmap "swift-ring-files" not found Nov 26 15:05:42 crc kubenswrapper[4651]: I1126 15:05:42.378475 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4ggsw" event={"ID":"10d9303a-3724-4f7c-90b2-ef9ba8b92200","Type":"ContainerStarted","Data":"1a71d942a66be87e50d7f32e3d0d4733833800c32392b3b90a8198e7b095a699"} Nov 26 15:05:42 crc kubenswrapper[4651]: I1126 15:05:42.379856 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:42 crc kubenswrapper[4651]: I1126 15:05:42.408073 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-4ggsw" podStartSLOduration=3.408016444 podStartE2EDuration="3.408016444s" podCreationTimestamp="2025-11-26 15:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:42.398979158 +0000 UTC m=+909.824726782" watchObservedRunningTime="2025-11-26 15:05:42.408016444 +0000 UTC m=+909.833764058" Nov 26 15:05:42 crc kubenswrapper[4651]: I1126 15:05:42.921710 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.159739 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:44 crc kubenswrapper[4651]: E1126 15:05:44.159943 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:05:44 crc kubenswrapper[4651]: E1126 15:05:44.160133 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:05:44 crc kubenswrapper[4651]: E1126 15:05:44.160189 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift podName:a3b8c2db-ce7f-48ce-9fd1-d55b5583773e nodeName:}" failed. No retries permitted until 2025-11-26 15:05:48.160172813 +0000 UTC m=+915.585920417 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift") pod "swift-storage-0" (UID: "a3b8c2db-ce7f-48ce-9fd1-d55b5583773e") : configmap "swift-ring-files" not found Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.354518 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9p29z"] Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.355822 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.368437 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9p29z"] Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.370026 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.370407 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.370575 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.416014 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9p29z"] Nov 26 15:05:44 crc kubenswrapper[4651]: E1126 15:05:44.416959 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-gqkc5 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-gqkc5 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-9p29z" podUID="69adb704-a7e2-4bc2-8165-d99a10b8ca77" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.464879 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69adb704-a7e2-4bc2-8165-d99a10b8ca77-etc-swift\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.465196 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-scripts\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.465316 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-combined-ca-bundle\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.465484 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqkc5\" (UniqueName: \"kubernetes.io/projected/69adb704-a7e2-4bc2-8165-d99a10b8ca77-kube-api-access-gqkc5\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.465616 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-dispersionconf\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.465708 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-ring-data-devices\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.465780 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-swiftconf\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.567391 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-dispersionconf\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.567453 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-swiftconf\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.567468 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-ring-data-devices\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.567532 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69adb704-a7e2-4bc2-8165-d99a10b8ca77-etc-swift\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.567569 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-scripts\") 
pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.567587 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-combined-ca-bundle\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.567627 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqkc5\" (UniqueName: \"kubernetes.io/projected/69adb704-a7e2-4bc2-8165-d99a10b8ca77-kube-api-access-gqkc5\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.568252 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69adb704-a7e2-4bc2-8165-d99a10b8ca77-etc-swift\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.568433 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-ring-data-devices\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.568774 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-scripts\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " 
pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.573501 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-swiftconf\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.579434 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-dispersionconf\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.586358 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqkc5\" (UniqueName: \"kubernetes.io/projected/69adb704-a7e2-4bc2-8165-d99a10b8ca77-kube-api-access-gqkc5\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:44 crc kubenswrapper[4651]: I1126 15:05:44.589954 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-combined-ca-bundle\") pod \"swift-ring-rebalance-9p29z\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.409874 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.427300 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.483449 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-swiftconf\") pod \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.483842 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-dispersionconf\") pod \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.484016 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-combined-ca-bundle\") pod \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.484243 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-scripts\") pod \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.484382 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69adb704-a7e2-4bc2-8165-d99a10b8ca77-etc-swift\") pod \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.484521 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqkc5\" (UniqueName: 
\"kubernetes.io/projected/69adb704-a7e2-4bc2-8165-d99a10b8ca77-kube-api-access-gqkc5\") pod \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.484629 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-ring-data-devices\") pod \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\" (UID: \"69adb704-a7e2-4bc2-8165-d99a10b8ca77\") " Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.484577 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-scripts" (OuterVolumeSpecName: "scripts") pod "69adb704-a7e2-4bc2-8165-d99a10b8ca77" (UID: "69adb704-a7e2-4bc2-8165-d99a10b8ca77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.484594 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69adb704-a7e2-4bc2-8165-d99a10b8ca77-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "69adb704-a7e2-4bc2-8165-d99a10b8ca77" (UID: "69adb704-a7e2-4bc2-8165-d99a10b8ca77"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.484929 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "69adb704-a7e2-4bc2-8165-d99a10b8ca77" (UID: "69adb704-a7e2-4bc2-8165-d99a10b8ca77"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.485448 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.485553 4651 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69adb704-a7e2-4bc2-8165-d99a10b8ca77-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.485631 4651 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69adb704-a7e2-4bc2-8165-d99a10b8ca77-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.486823 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "69adb704-a7e2-4bc2-8165-d99a10b8ca77" (UID: "69adb704-a7e2-4bc2-8165-d99a10b8ca77"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.487362 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "69adb704-a7e2-4bc2-8165-d99a10b8ca77" (UID: "69adb704-a7e2-4bc2-8165-d99a10b8ca77"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.487995 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69adb704-a7e2-4bc2-8165-d99a10b8ca77" (UID: "69adb704-a7e2-4bc2-8165-d99a10b8ca77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.488629 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69adb704-a7e2-4bc2-8165-d99a10b8ca77-kube-api-access-gqkc5" (OuterVolumeSpecName: "kube-api-access-gqkc5") pod "69adb704-a7e2-4bc2-8165-d99a10b8ca77" (UID: "69adb704-a7e2-4bc2-8165-d99a10b8ca77"). InnerVolumeSpecName "kube-api-access-gqkc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.587966 4651 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.588464 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.588703 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqkc5\" (UniqueName: \"kubernetes.io/projected/69adb704-a7e2-4bc2-8165-d99a10b8ca77-kube-api-access-gqkc5\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:45 crc kubenswrapper[4651]: I1126 15:05:45.588908 4651 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/69adb704-a7e2-4bc2-8165-d99a10b8ca77-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.421495 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9p29z" Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.478212 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9p29z"] Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.489813 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-9p29z"] Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.847339 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c181-account-create-update-prbwt"] Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.848394 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c181-account-create-update-prbwt" Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.854722 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.867579 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c181-account-create-update-prbwt"] Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.904342 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-v6vlh"] Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.905593 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-v6vlh" Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.910239 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2667dc1-777a-469d-8021-ff0881c8d0d2-operator-scripts\") pod \"keystone-c181-account-create-update-prbwt\" (UID: \"a2667dc1-777a-469d-8021-ff0881c8d0d2\") " pod="openstack/keystone-c181-account-create-update-prbwt" Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.910447 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfrvr\" (UniqueName: \"kubernetes.io/projected/a2667dc1-777a-469d-8021-ff0881c8d0d2-kube-api-access-pfrvr\") pod \"keystone-c181-account-create-update-prbwt\" (UID: \"a2667dc1-777a-469d-8021-ff0881c8d0d2\") " pod="openstack/keystone-c181-account-create-update-prbwt" Nov 26 15:05:46 crc kubenswrapper[4651]: I1126 15:05:46.915579 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-v6vlh"] Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.011685 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfrvr\" (UniqueName: \"kubernetes.io/projected/a2667dc1-777a-469d-8021-ff0881c8d0d2-kube-api-access-pfrvr\") pod \"keystone-c181-account-create-update-prbwt\" (UID: \"a2667dc1-777a-469d-8021-ff0881c8d0d2\") " pod="openstack/keystone-c181-account-create-update-prbwt" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.011984 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2667dc1-777a-469d-8021-ff0881c8d0d2-operator-scripts\") pod \"keystone-c181-account-create-update-prbwt\" (UID: \"a2667dc1-777a-469d-8021-ff0881c8d0d2\") " pod="openstack/keystone-c181-account-create-update-prbwt" Nov 26 15:05:47 crc 
kubenswrapper[4651]: I1126 15:05:47.012097 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/704d880b-4c5f-4663-9bb1-8e40fa9b6752-operator-scripts\") pod \"keystone-db-create-v6vlh\" (UID: \"704d880b-4c5f-4663-9bb1-8e40fa9b6752\") " pod="openstack/keystone-db-create-v6vlh" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.012258 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb8vv\" (UniqueName: \"kubernetes.io/projected/704d880b-4c5f-4663-9bb1-8e40fa9b6752-kube-api-access-mb8vv\") pod \"keystone-db-create-v6vlh\" (UID: \"704d880b-4c5f-4663-9bb1-8e40fa9b6752\") " pod="openstack/keystone-db-create-v6vlh" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.012738 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2667dc1-777a-469d-8021-ff0881c8d0d2-operator-scripts\") pod \"keystone-c181-account-create-update-prbwt\" (UID: \"a2667dc1-777a-469d-8021-ff0881c8d0d2\") " pod="openstack/keystone-c181-account-create-update-prbwt" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.043987 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfrvr\" (UniqueName: \"kubernetes.io/projected/a2667dc1-777a-469d-8021-ff0881c8d0d2-kube-api-access-pfrvr\") pod \"keystone-c181-account-create-update-prbwt\" (UID: \"a2667dc1-777a-469d-8021-ff0881c8d0d2\") " pod="openstack/keystone-c181-account-create-update-prbwt" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.113927 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb8vv\" (UniqueName: \"kubernetes.io/projected/704d880b-4c5f-4663-9bb1-8e40fa9b6752-kube-api-access-mb8vv\") pod \"keystone-db-create-v6vlh\" (UID: \"704d880b-4c5f-4663-9bb1-8e40fa9b6752\") " 
pod="openstack/keystone-db-create-v6vlh" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.114400 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/704d880b-4c5f-4663-9bb1-8e40fa9b6752-operator-scripts\") pod \"keystone-db-create-v6vlh\" (UID: \"704d880b-4c5f-4663-9bb1-8e40fa9b6752\") " pod="openstack/keystone-db-create-v6vlh" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.114982 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/704d880b-4c5f-4663-9bb1-8e40fa9b6752-operator-scripts\") pod \"keystone-db-create-v6vlh\" (UID: \"704d880b-4c5f-4663-9bb1-8e40fa9b6752\") " pod="openstack/keystone-db-create-v6vlh" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.115164 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-wdqdf"] Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.116329 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wdqdf" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.125841 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wdqdf"] Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.168132 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb8vv\" (UniqueName: \"kubernetes.io/projected/704d880b-4c5f-4663-9bb1-8e40fa9b6752-kube-api-access-mb8vv\") pod \"keystone-db-create-v6vlh\" (UID: \"704d880b-4c5f-4663-9bb1-8e40fa9b6752\") " pod="openstack/keystone-db-create-v6vlh" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.170556 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c181-account-create-update-prbwt" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.215791 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccbf26af-444d-421b-8b60-4c6c343564cb-operator-scripts\") pod \"placement-db-create-wdqdf\" (UID: \"ccbf26af-444d-421b-8b60-4c6c343564cb\") " pod="openstack/placement-db-create-wdqdf" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.216160 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4z52\" (UniqueName: \"kubernetes.io/projected/ccbf26af-444d-421b-8b60-4c6c343564cb-kube-api-access-s4z52\") pod \"placement-db-create-wdqdf\" (UID: \"ccbf26af-444d-421b-8b60-4c6c343564cb\") " pod="openstack/placement-db-create-wdqdf" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.230794 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6749-account-create-update-5qhl6"] Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.231813 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6749-account-create-update-5qhl6" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.235468 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.237687 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-v6vlh" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.242960 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6749-account-create-update-5qhl6"] Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.322707 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccbf26af-444d-421b-8b60-4c6c343564cb-operator-scripts\") pod \"placement-db-create-wdqdf\" (UID: \"ccbf26af-444d-421b-8b60-4c6c343564cb\") " pod="openstack/placement-db-create-wdqdf" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.322771 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h6rq\" (UniqueName: \"kubernetes.io/projected/e5a57ad0-61ac-42e0-b0e2-602914415dee-kube-api-access-2h6rq\") pod \"placement-6749-account-create-update-5qhl6\" (UID: \"e5a57ad0-61ac-42e0-b0e2-602914415dee\") " pod="openstack/placement-6749-account-create-update-5qhl6" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.322860 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4z52\" (UniqueName: \"kubernetes.io/projected/ccbf26af-444d-421b-8b60-4c6c343564cb-kube-api-access-s4z52\") pod \"placement-db-create-wdqdf\" (UID: \"ccbf26af-444d-421b-8b60-4c6c343564cb\") " pod="openstack/placement-db-create-wdqdf" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.322879 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5a57ad0-61ac-42e0-b0e2-602914415dee-operator-scripts\") pod \"placement-6749-account-create-update-5qhl6\" (UID: \"e5a57ad0-61ac-42e0-b0e2-602914415dee\") " pod="openstack/placement-6749-account-create-update-5qhl6" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.324951 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccbf26af-444d-421b-8b60-4c6c343564cb-operator-scripts\") pod \"placement-db-create-wdqdf\" (UID: \"ccbf26af-444d-421b-8b60-4c6c343564cb\") " pod="openstack/placement-db-create-wdqdf" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.364673 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4z52\" (UniqueName: \"kubernetes.io/projected/ccbf26af-444d-421b-8b60-4c6c343564cb-kube-api-access-s4z52\") pod \"placement-db-create-wdqdf\" (UID: \"ccbf26af-444d-421b-8b60-4c6c343564cb\") " pod="openstack/placement-db-create-wdqdf" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.367667 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ss5m4"] Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.368896 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ss5m4" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.397175 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ss5m4"] Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.415201 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69adb704-a7e2-4bc2-8165-d99a10b8ca77" path="/var/lib/kubelet/pods/69adb704-a7e2-4bc2-8165-d99a10b8ca77/volumes" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.425191 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h6rq\" (UniqueName: \"kubernetes.io/projected/e5a57ad0-61ac-42e0-b0e2-602914415dee-kube-api-access-2h6rq\") pod \"placement-6749-account-create-update-5qhl6\" (UID: \"e5a57ad0-61ac-42e0-b0e2-602914415dee\") " pod="openstack/placement-6749-account-create-update-5qhl6" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.425242 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e5f579e-17c0-425c-bd60-0f4950eabdc8-operator-scripts\") pod \"glance-db-create-ss5m4\" (UID: \"9e5f579e-17c0-425c-bd60-0f4950eabdc8\") " pod="openstack/glance-db-create-ss5m4" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.425309 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5a57ad0-61ac-42e0-b0e2-602914415dee-operator-scripts\") pod \"placement-6749-account-create-update-5qhl6\" (UID: \"e5a57ad0-61ac-42e0-b0e2-602914415dee\") " pod="openstack/placement-6749-account-create-update-5qhl6" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.425333 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nw7q\" (UniqueName: \"kubernetes.io/projected/9e5f579e-17c0-425c-bd60-0f4950eabdc8-kube-api-access-6nw7q\") pod \"glance-db-create-ss5m4\" (UID: \"9e5f579e-17c0-425c-bd60-0f4950eabdc8\") " pod="openstack/glance-db-create-ss5m4" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.426368 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5a57ad0-61ac-42e0-b0e2-602914415dee-operator-scripts\") pod \"placement-6749-account-create-update-5qhl6\" (UID: \"e5a57ad0-61ac-42e0-b0e2-602914415dee\") " pod="openstack/placement-6749-account-create-update-5qhl6" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.442526 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-wdqdf" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.447591 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h6rq\" (UniqueName: \"kubernetes.io/projected/e5a57ad0-61ac-42e0-b0e2-602914415dee-kube-api-access-2h6rq\") pod \"placement-6749-account-create-update-5qhl6\" (UID: \"e5a57ad0-61ac-42e0-b0e2-602914415dee\") " pod="openstack/placement-6749-account-create-update-5qhl6" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.536552 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e5f579e-17c0-425c-bd60-0f4950eabdc8-operator-scripts\") pod \"glance-db-create-ss5m4\" (UID: \"9e5f579e-17c0-425c-bd60-0f4950eabdc8\") " pod="openstack/glance-db-create-ss5m4" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.536664 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nw7q\" (UniqueName: \"kubernetes.io/projected/9e5f579e-17c0-425c-bd60-0f4950eabdc8-kube-api-access-6nw7q\") pod \"glance-db-create-ss5m4\" (UID: \"9e5f579e-17c0-425c-bd60-0f4950eabdc8\") " pod="openstack/glance-db-create-ss5m4" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.538087 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e5f579e-17c0-425c-bd60-0f4950eabdc8-operator-scripts\") pod \"glance-db-create-ss5m4\" (UID: \"9e5f579e-17c0-425c-bd60-0f4950eabdc8\") " pod="openstack/glance-db-create-ss5m4" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.543922 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d20d-account-create-update-56vgl"] Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.545380 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d20d-account-create-update-56vgl" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.562612 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.565647 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nw7q\" (UniqueName: \"kubernetes.io/projected/9e5f579e-17c0-425c-bd60-0f4950eabdc8-kube-api-access-6nw7q\") pod \"glance-db-create-ss5m4\" (UID: \"9e5f579e-17c0-425c-bd60-0f4950eabdc8\") " pod="openstack/glance-db-create-ss5m4" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.579271 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d20d-account-create-update-56vgl"] Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.638719 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rplt\" (UniqueName: \"kubernetes.io/projected/eb658c85-2b54-43fb-9938-0c5558ae3da8-kube-api-access-2rplt\") pod \"glance-d20d-account-create-update-56vgl\" (UID: \"eb658c85-2b54-43fb-9938-0c5558ae3da8\") " pod="openstack/glance-d20d-account-create-update-56vgl" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.638852 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb658c85-2b54-43fb-9938-0c5558ae3da8-operator-scripts\") pod \"glance-d20d-account-create-update-56vgl\" (UID: \"eb658c85-2b54-43fb-9938-0c5558ae3da8\") " pod="openstack/glance-d20d-account-create-update-56vgl" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.687540 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6749-account-create-update-5qhl6" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.698694 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c181-account-create-update-prbwt"] Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.699026 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ss5m4" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.740111 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb658c85-2b54-43fb-9938-0c5558ae3da8-operator-scripts\") pod \"glance-d20d-account-create-update-56vgl\" (UID: \"eb658c85-2b54-43fb-9938-0c5558ae3da8\") " pod="openstack/glance-d20d-account-create-update-56vgl" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.740246 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rplt\" (UniqueName: \"kubernetes.io/projected/eb658c85-2b54-43fb-9938-0c5558ae3da8-kube-api-access-2rplt\") pod \"glance-d20d-account-create-update-56vgl\" (UID: \"eb658c85-2b54-43fb-9938-0c5558ae3da8\") " pod="openstack/glance-d20d-account-create-update-56vgl" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.740878 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb658c85-2b54-43fb-9938-0c5558ae3da8-operator-scripts\") pod \"glance-d20d-account-create-update-56vgl\" (UID: \"eb658c85-2b54-43fb-9938-0c5558ae3da8\") " pod="openstack/glance-d20d-account-create-update-56vgl" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.756879 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rplt\" (UniqueName: \"kubernetes.io/projected/eb658c85-2b54-43fb-9938-0c5558ae3da8-kube-api-access-2rplt\") pod \"glance-d20d-account-create-update-56vgl\" 
(UID: \"eb658c85-2b54-43fb-9938-0c5558ae3da8\") " pod="openstack/glance-d20d-account-create-update-56vgl" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.830202 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-v6vlh"] Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.888239 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d20d-account-create-update-56vgl" Nov 26 15:05:47 crc kubenswrapper[4651]: I1126 15:05:47.948774 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wdqdf"] Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.070896 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ss5m4"] Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.162609 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:48 crc kubenswrapper[4651]: E1126 15:05:48.162894 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:05:48 crc kubenswrapper[4651]: E1126 15:05:48.162914 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:05:48 crc kubenswrapper[4651]: E1126 15:05:48.162966 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift podName:a3b8c2db-ce7f-48ce-9fd1-d55b5583773e nodeName:}" failed. No retries permitted until 2025-11-26 15:05:56.162951407 +0000 UTC m=+923.588699011 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift") pod "swift-storage-0" (UID: "a3b8c2db-ce7f-48ce-9fd1-d55b5583773e") : configmap "swift-ring-files" not found Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.276856 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6749-account-create-update-5qhl6"] Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.426420 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d20d-account-create-update-56vgl"] Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.459568 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ss5m4" event={"ID":"9e5f579e-17c0-425c-bd60-0f4950eabdc8","Type":"ContainerStarted","Data":"031b5816df5b664b5159f00fc28e25e3ed1df0c21444a52a867f56056c71a874"} Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.461580 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6749-account-create-update-5qhl6" event={"ID":"e5a57ad0-61ac-42e0-b0e2-602914415dee","Type":"ContainerStarted","Data":"3967be89ec88114dc6b988fa35f5bfb7e745bd67b189ee0d1ad1ffcc3f90031c"} Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.466018 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wdqdf" event={"ID":"ccbf26af-444d-421b-8b60-4c6c343564cb","Type":"ContainerStarted","Data":"63b6c5f574909681e2f30909c9ec4f70dc295acd9be30f0a3b390a641bfa9e84"} Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.466084 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wdqdf" event={"ID":"ccbf26af-444d-421b-8b60-4c6c343564cb","Type":"ContainerStarted","Data":"2866929b708a242ec59aeac979f2b3234136c17515d74fc2e24945c6c52fb69e"} Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.479070 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-c181-account-create-update-prbwt" event={"ID":"a2667dc1-777a-469d-8021-ff0881c8d0d2","Type":"ContainerStarted","Data":"9cb99b3819cc0f226317e4c505f91704ca7220f0491a790cef76642ec169ef8f"} Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.479117 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c181-account-create-update-prbwt" event={"ID":"a2667dc1-777a-469d-8021-ff0881c8d0d2","Type":"ContainerStarted","Data":"47b58aa17e370e1a938771d35ca3e6e36f96ddaa3f3089c29d637775656c1bbe"} Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.491609 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-v6vlh" event={"ID":"704d880b-4c5f-4663-9bb1-8e40fa9b6752","Type":"ContainerStarted","Data":"8bf7aa7b1c24b5465b8e11f794e46d5703cc19f182d59750ad2f79e409301bf0"} Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.506825 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-wdqdf" podStartSLOduration=1.50680766 podStartE2EDuration="1.50680766s" podCreationTimestamp="2025-11-26 15:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:48.503379497 +0000 UTC m=+915.929127101" watchObservedRunningTime="2025-11-26 15:05:48.50680766 +0000 UTC m=+915.932555264" Nov 26 15:05:48 crc kubenswrapper[4651]: I1126 15:05:48.525622 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-c181-account-create-update-prbwt" podStartSLOduration=2.5256023130000003 podStartE2EDuration="2.525602313s" podCreationTimestamp="2025-11-26 15:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:05:48.521668146 +0000 UTC m=+915.947415750" watchObservedRunningTime="2025-11-26 15:05:48.525602313 +0000 UTC m=+915.951349927" 
Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.499248 4651 generic.go:334] "Generic (PLEG): container finished" podID="ccbf26af-444d-421b-8b60-4c6c343564cb" containerID="63b6c5f574909681e2f30909c9ec4f70dc295acd9be30f0a3b390a641bfa9e84" exitCode=0 Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.499343 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wdqdf" event={"ID":"ccbf26af-444d-421b-8b60-4c6c343564cb","Type":"ContainerDied","Data":"63b6c5f574909681e2f30909c9ec4f70dc295acd9be30f0a3b390a641bfa9e84"} Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.500834 4651 generic.go:334] "Generic (PLEG): container finished" podID="a2667dc1-777a-469d-8021-ff0881c8d0d2" containerID="9cb99b3819cc0f226317e4c505f91704ca7220f0491a790cef76642ec169ef8f" exitCode=0 Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.500917 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c181-account-create-update-prbwt" event={"ID":"a2667dc1-777a-469d-8021-ff0881c8d0d2","Type":"ContainerDied","Data":"9cb99b3819cc0f226317e4c505f91704ca7220f0491a790cef76642ec169ef8f"} Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.503589 4651 generic.go:334] "Generic (PLEG): container finished" podID="704d880b-4c5f-4663-9bb1-8e40fa9b6752" containerID="e7911fc74bf464bdbbbd1fa053975a83c91c74fa5c7ee1865fb6c895f6b42637" exitCode=0 Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.503638 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-v6vlh" event={"ID":"704d880b-4c5f-4663-9bb1-8e40fa9b6752","Type":"ContainerDied","Data":"e7911fc74bf464bdbbbd1fa053975a83c91c74fa5c7ee1865fb6c895f6b42637"} Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.504945 4651 generic.go:334] "Generic (PLEG): container finished" podID="eb658c85-2b54-43fb-9938-0c5558ae3da8" containerID="1da4d98336b2602d323333a3b2107cce160186ae704bf29529accdb4d7a714a6" exitCode=0 Nov 26 15:05:49 crc 
kubenswrapper[4651]: I1126 15:05:49.505001 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d20d-account-create-update-56vgl" event={"ID":"eb658c85-2b54-43fb-9938-0c5558ae3da8","Type":"ContainerDied","Data":"1da4d98336b2602d323333a3b2107cce160186ae704bf29529accdb4d7a714a6"} Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.505026 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d20d-account-create-update-56vgl" event={"ID":"eb658c85-2b54-43fb-9938-0c5558ae3da8","Type":"ContainerStarted","Data":"51ad8b7fbbb2386b221fb69564402faae12e601846dffdd943725ddc41a53307"} Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.506652 4651 generic.go:334] "Generic (PLEG): container finished" podID="9e5f579e-17c0-425c-bd60-0f4950eabdc8" containerID="69034c1c606c777a387ebc575a28e24076e705a9cc33a01fb844194a5080c732" exitCode=0 Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.506685 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ss5m4" event={"ID":"9e5f579e-17c0-425c-bd60-0f4950eabdc8","Type":"ContainerDied","Data":"69034c1c606c777a387ebc575a28e24076e705a9cc33a01fb844194a5080c732"} Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.507940 4651 generic.go:334] "Generic (PLEG): container finished" podID="e5a57ad0-61ac-42e0-b0e2-602914415dee" containerID="12001aff69ebce8862f2b559eb46b5f5569de2a9bb0817e8d30888970aee26aa" exitCode=0 Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.507972 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6749-account-create-update-5qhl6" event={"ID":"e5a57ad0-61ac-42e0-b0e2-602914415dee","Type":"ContainerDied","Data":"12001aff69ebce8862f2b559eb46b5f5569de2a9bb0817e8d30888970aee26aa"} Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.647244 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:05:49 crc kubenswrapper[4651]: 
I1126 15:05:49.704374 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zchf4"] Nov 26 15:05:49 crc kubenswrapper[4651]: I1126 15:05:49.704907 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" podUID="e0877833-64b2-4651-afcd-17ec0d3b4a44" containerName="dnsmasq-dns" containerID="cri-o://877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa" gracePeriod=10 Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.199922 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.380696 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-nb\") pod \"e0877833-64b2-4651-afcd-17ec0d3b4a44\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.380765 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-dns-svc\") pod \"e0877833-64b2-4651-afcd-17ec0d3b4a44\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.380832 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s26sw\" (UniqueName: \"kubernetes.io/projected/e0877833-64b2-4651-afcd-17ec0d3b4a44-kube-api-access-s26sw\") pod \"e0877833-64b2-4651-afcd-17ec0d3b4a44\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.380878 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-config\") pod 
\"e0877833-64b2-4651-afcd-17ec0d3b4a44\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.380917 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-sb\") pod \"e0877833-64b2-4651-afcd-17ec0d3b4a44\" (UID: \"e0877833-64b2-4651-afcd-17ec0d3b4a44\") " Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.401293 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0877833-64b2-4651-afcd-17ec0d3b4a44-kube-api-access-s26sw" (OuterVolumeSpecName: "kube-api-access-s26sw") pod "e0877833-64b2-4651-afcd-17ec0d3b4a44" (UID: "e0877833-64b2-4651-afcd-17ec0d3b4a44"). InnerVolumeSpecName "kube-api-access-s26sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.460486 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-config" (OuterVolumeSpecName: "config") pod "e0877833-64b2-4651-afcd-17ec0d3b4a44" (UID: "e0877833-64b2-4651-afcd-17ec0d3b4a44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.467160 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e0877833-64b2-4651-afcd-17ec0d3b4a44" (UID: "e0877833-64b2-4651-afcd-17ec0d3b4a44"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.473575 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0877833-64b2-4651-afcd-17ec0d3b4a44" (UID: "e0877833-64b2-4651-afcd-17ec0d3b4a44"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.485484 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.485517 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s26sw\" (UniqueName: \"kubernetes.io/projected/e0877833-64b2-4651-afcd-17ec0d3b4a44-kube-api-access-s26sw\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.485527 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.485536 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.487087 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e0877833-64b2-4651-afcd-17ec0d3b4a44" (UID: "e0877833-64b2-4651-afcd-17ec0d3b4a44"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.520695 4651 generic.go:334] "Generic (PLEG): container finished" podID="e0877833-64b2-4651-afcd-17ec0d3b4a44" containerID="877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa" exitCode=0 Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.520744 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.520784 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" event={"ID":"e0877833-64b2-4651-afcd-17ec0d3b4a44","Type":"ContainerDied","Data":"877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa"} Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.520812 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zchf4" event={"ID":"e0877833-64b2-4651-afcd-17ec0d3b4a44","Type":"ContainerDied","Data":"b3c1637d930c47f0be6cfaf110d52a32753705689b984c0a5b843ead4afce696"} Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.520830 4651 scope.go:117] "RemoveContainer" containerID="877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.561413 4651 scope.go:117] "RemoveContainer" containerID="aa560f51ef7cab75d6a555a37faba3880576d1e9d7484e99b85aee8292098dc9" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.574886 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zchf4"] Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.582249 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zchf4"] Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.587232 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e0877833-64b2-4651-afcd-17ec0d3b4a44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.592234 4651 scope.go:117] "RemoveContainer" containerID="877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa" Nov 26 15:05:50 crc kubenswrapper[4651]: E1126 15:05:50.592613 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa\": container with ID starting with 877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa not found: ID does not exist" containerID="877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.592658 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa"} err="failed to get container status \"877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa\": rpc error: code = NotFound desc = could not find container \"877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa\": container with ID starting with 877dcc89b3999fb2e99ed585112b33678b387e4caeced9c4928d75064fdbf6fa not found: ID does not exist" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.592683 4651 scope.go:117] "RemoveContainer" containerID="aa560f51ef7cab75d6a555a37faba3880576d1e9d7484e99b85aee8292098dc9" Nov 26 15:05:50 crc kubenswrapper[4651]: E1126 15:05:50.592955 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa560f51ef7cab75d6a555a37faba3880576d1e9d7484e99b85aee8292098dc9\": container with ID starting with aa560f51ef7cab75d6a555a37faba3880576d1e9d7484e99b85aee8292098dc9 not found: ID does not exist" containerID="aa560f51ef7cab75d6a555a37faba3880576d1e9d7484e99b85aee8292098dc9" Nov 
26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.592982 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa560f51ef7cab75d6a555a37faba3880576d1e9d7484e99b85aee8292098dc9"} err="failed to get container status \"aa560f51ef7cab75d6a555a37faba3880576d1e9d7484e99b85aee8292098dc9\": rpc error: code = NotFound desc = could not find container \"aa560f51ef7cab75d6a555a37faba3880576d1e9d7484e99b85aee8292098dc9\": container with ID starting with aa560f51ef7cab75d6a555a37faba3880576d1e9d7484e99b85aee8292098dc9 not found: ID does not exist" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.849810 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6749-account-create-update-5qhl6" Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.995581 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h6rq\" (UniqueName: \"kubernetes.io/projected/e5a57ad0-61ac-42e0-b0e2-602914415dee-kube-api-access-2h6rq\") pod \"e5a57ad0-61ac-42e0-b0e2-602914415dee\" (UID: \"e5a57ad0-61ac-42e0-b0e2-602914415dee\") " Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.995806 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5a57ad0-61ac-42e0-b0e2-602914415dee-operator-scripts\") pod \"e5a57ad0-61ac-42e0-b0e2-602914415dee\" (UID: \"e5a57ad0-61ac-42e0-b0e2-602914415dee\") " Nov 26 15:05:50 crc kubenswrapper[4651]: I1126 15:05:50.996764 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a57ad0-61ac-42e0-b0e2-602914415dee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5a57ad0-61ac-42e0-b0e2-602914415dee" (UID: "e5a57ad0-61ac-42e0-b0e2-602914415dee"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:50.999376 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a57ad0-61ac-42e0-b0e2-602914415dee-kube-api-access-2h6rq" (OuterVolumeSpecName: "kube-api-access-2h6rq") pod "e5a57ad0-61ac-42e0-b0e2-602914415dee" (UID: "e5a57ad0-61ac-42e0-b0e2-602914415dee"). InnerVolumeSpecName "kube-api-access-2h6rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.017713 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wdqdf" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.062245 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c181-account-create-update-prbwt" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.070029 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ss5m4" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.095533 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-v6vlh" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.108832 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5a57ad0-61ac-42e0-b0e2-602914415dee-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.108861 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h6rq\" (UniqueName: \"kubernetes.io/projected/e5a57ad0-61ac-42e0-b0e2-602914415dee-kube-api-access-2h6rq\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.195065 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d20d-account-create-update-56vgl" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.212364 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rplt\" (UniqueName: \"kubernetes.io/projected/eb658c85-2b54-43fb-9938-0c5558ae3da8-kube-api-access-2rplt\") pod \"eb658c85-2b54-43fb-9938-0c5558ae3da8\" (UID: \"eb658c85-2b54-43fb-9938-0c5558ae3da8\") " Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.212432 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccbf26af-444d-421b-8b60-4c6c343564cb-operator-scripts\") pod \"ccbf26af-444d-421b-8b60-4c6c343564cb\" (UID: \"ccbf26af-444d-421b-8b60-4c6c343564cb\") " Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.212458 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb8vv\" (UniqueName: \"kubernetes.io/projected/704d880b-4c5f-4663-9bb1-8e40fa9b6752-kube-api-access-mb8vv\") pod \"704d880b-4c5f-4663-9bb1-8e40fa9b6752\" (UID: \"704d880b-4c5f-4663-9bb1-8e40fa9b6752\") " Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.212479 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb658c85-2b54-43fb-9938-0c5558ae3da8-operator-scripts\") pod \"eb658c85-2b54-43fb-9938-0c5558ae3da8\" (UID: \"eb658c85-2b54-43fb-9938-0c5558ae3da8\") " Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.212519 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nw7q\" (UniqueName: \"kubernetes.io/projected/9e5f579e-17c0-425c-bd60-0f4950eabdc8-kube-api-access-6nw7q\") pod \"9e5f579e-17c0-425c-bd60-0f4950eabdc8\" (UID: \"9e5f579e-17c0-425c-bd60-0f4950eabdc8\") " Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.212537 4651 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/704d880b-4c5f-4663-9bb1-8e40fa9b6752-operator-scripts\") pod \"704d880b-4c5f-4663-9bb1-8e40fa9b6752\" (UID: \"704d880b-4c5f-4663-9bb1-8e40fa9b6752\") " Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.212558 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfrvr\" (UniqueName: \"kubernetes.io/projected/a2667dc1-777a-469d-8021-ff0881c8d0d2-kube-api-access-pfrvr\") pod \"a2667dc1-777a-469d-8021-ff0881c8d0d2\" (UID: \"a2667dc1-777a-469d-8021-ff0881c8d0d2\") " Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.212591 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2667dc1-777a-469d-8021-ff0881c8d0d2-operator-scripts\") pod \"a2667dc1-777a-469d-8021-ff0881c8d0d2\" (UID: \"a2667dc1-777a-469d-8021-ff0881c8d0d2\") " Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.212623 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4z52\" (UniqueName: \"kubernetes.io/projected/ccbf26af-444d-421b-8b60-4c6c343564cb-kube-api-access-s4z52\") pod \"ccbf26af-444d-421b-8b60-4c6c343564cb\" (UID: \"ccbf26af-444d-421b-8b60-4c6c343564cb\") " Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.212649 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e5f579e-17c0-425c-bd60-0f4950eabdc8-operator-scripts\") pod \"9e5f579e-17c0-425c-bd60-0f4950eabdc8\" (UID: \"9e5f579e-17c0-425c-bd60-0f4950eabdc8\") " Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.213398 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e5f579e-17c0-425c-bd60-0f4950eabdc8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"9e5f579e-17c0-425c-bd60-0f4950eabdc8" (UID: "9e5f579e-17c0-425c-bd60-0f4950eabdc8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.214335 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb658c85-2b54-43fb-9938-0c5558ae3da8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb658c85-2b54-43fb-9938-0c5558ae3da8" (UID: "eb658c85-2b54-43fb-9938-0c5558ae3da8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.214446 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2667dc1-777a-469d-8021-ff0881c8d0d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2667dc1-777a-469d-8021-ff0881c8d0d2" (UID: "a2667dc1-777a-469d-8021-ff0881c8d0d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.214956 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/704d880b-4c5f-4663-9bb1-8e40fa9b6752-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "704d880b-4c5f-4663-9bb1-8e40fa9b6752" (UID: "704d880b-4c5f-4663-9bb1-8e40fa9b6752"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.215774 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccbf26af-444d-421b-8b60-4c6c343564cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ccbf26af-444d-421b-8b60-4c6c343564cb" (UID: "ccbf26af-444d-421b-8b60-4c6c343564cb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.226642 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccbf26af-444d-421b-8b60-4c6c343564cb-kube-api-access-s4z52" (OuterVolumeSpecName: "kube-api-access-s4z52") pod "ccbf26af-444d-421b-8b60-4c6c343564cb" (UID: "ccbf26af-444d-421b-8b60-4c6c343564cb"). InnerVolumeSpecName "kube-api-access-s4z52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.228805 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/704d880b-4c5f-4663-9bb1-8e40fa9b6752-kube-api-access-mb8vv" (OuterVolumeSpecName: "kube-api-access-mb8vv") pod "704d880b-4c5f-4663-9bb1-8e40fa9b6752" (UID: "704d880b-4c5f-4663-9bb1-8e40fa9b6752"). InnerVolumeSpecName "kube-api-access-mb8vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.228907 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e5f579e-17c0-425c-bd60-0f4950eabdc8-kube-api-access-6nw7q" (OuterVolumeSpecName: "kube-api-access-6nw7q") pod "9e5f579e-17c0-425c-bd60-0f4950eabdc8" (UID: "9e5f579e-17c0-425c-bd60-0f4950eabdc8"). InnerVolumeSpecName "kube-api-access-6nw7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.228984 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb658c85-2b54-43fb-9938-0c5558ae3da8-kube-api-access-2rplt" (OuterVolumeSpecName: "kube-api-access-2rplt") pod "eb658c85-2b54-43fb-9938-0c5558ae3da8" (UID: "eb658c85-2b54-43fb-9938-0c5558ae3da8"). InnerVolumeSpecName "kube-api-access-2rplt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.233535 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2667dc1-777a-469d-8021-ff0881c8d0d2-kube-api-access-pfrvr" (OuterVolumeSpecName: "kube-api-access-pfrvr") pod "a2667dc1-777a-469d-8021-ff0881c8d0d2" (UID: "a2667dc1-777a-469d-8021-ff0881c8d0d2"). InnerVolumeSpecName "kube-api-access-pfrvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.314308 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nw7q\" (UniqueName: \"kubernetes.io/projected/9e5f579e-17c0-425c-bd60-0f4950eabdc8-kube-api-access-6nw7q\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.314338 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/704d880b-4c5f-4663-9bb1-8e40fa9b6752-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.314347 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfrvr\" (UniqueName: \"kubernetes.io/projected/a2667dc1-777a-469d-8021-ff0881c8d0d2-kube-api-access-pfrvr\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.314355 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2667dc1-777a-469d-8021-ff0881c8d0d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.314363 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4z52\" (UniqueName: \"kubernetes.io/projected/ccbf26af-444d-421b-8b60-4c6c343564cb-kube-api-access-s4z52\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.314372 4651 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e5f579e-17c0-425c-bd60-0f4950eabdc8-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.314380 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rplt\" (UniqueName: \"kubernetes.io/projected/eb658c85-2b54-43fb-9938-0c5558ae3da8-kube-api-access-2rplt\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.314388 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccbf26af-444d-421b-8b60-4c6c343564cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.314399 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb8vv\" (UniqueName: \"kubernetes.io/projected/704d880b-4c5f-4663-9bb1-8e40fa9b6752-kube-api-access-mb8vv\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.314409 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb658c85-2b54-43fb-9938-0c5558ae3da8-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.411473 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0877833-64b2-4651-afcd-17ec0d3b4a44" path="/var/lib/kubelet/pods/e0877833-64b2-4651-afcd-17ec0d3b4a44/volumes" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.529094 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ss5m4" event={"ID":"9e5f579e-17c0-425c-bd60-0f4950eabdc8","Type":"ContainerDied","Data":"031b5816df5b664b5159f00fc28e25e3ed1df0c21444a52a867f56056c71a874"} Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.529134 4651 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="031b5816df5b664b5159f00fc28e25e3ed1df0c21444a52a867f56056c71a874" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.529180 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ss5m4" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.531325 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6749-account-create-update-5qhl6" event={"ID":"e5a57ad0-61ac-42e0-b0e2-602914415dee","Type":"ContainerDied","Data":"3967be89ec88114dc6b988fa35f5bfb7e745bd67b189ee0d1ad1ffcc3f90031c"} Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.531352 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3967be89ec88114dc6b988fa35f5bfb7e745bd67b189ee0d1ad1ffcc3f90031c" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.531397 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6749-account-create-update-5qhl6" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.533584 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wdqdf" event={"ID":"ccbf26af-444d-421b-8b60-4c6c343564cb","Type":"ContainerDied","Data":"2866929b708a242ec59aeac979f2b3234136c17515d74fc2e24945c6c52fb69e"} Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.533608 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2866929b708a242ec59aeac979f2b3234136c17515d74fc2e24945c6c52fb69e" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.533658 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wdqdf" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.537626 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c181-account-create-update-prbwt" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.537635 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c181-account-create-update-prbwt" event={"ID":"a2667dc1-777a-469d-8021-ff0881c8d0d2","Type":"ContainerDied","Data":"47b58aa17e370e1a938771d35ca3e6e36f96ddaa3f3089c29d637775656c1bbe"} Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.537714 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47b58aa17e370e1a938771d35ca3e6e36f96ddaa3f3089c29d637775656c1bbe" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.542063 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-v6vlh" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.542063 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-v6vlh" event={"ID":"704d880b-4c5f-4663-9bb1-8e40fa9b6752","Type":"ContainerDied","Data":"8bf7aa7b1c24b5465b8e11f794e46d5703cc19f182d59750ad2f79e409301bf0"} Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.542903 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bf7aa7b1c24b5465b8e11f794e46d5703cc19f182d59750ad2f79e409301bf0" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.544119 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d20d-account-create-update-56vgl" event={"ID":"eb658c85-2b54-43fb-9938-0c5558ae3da8","Type":"ContainerDied","Data":"51ad8b7fbbb2386b221fb69564402faae12e601846dffdd943725ddc41a53307"} Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.544142 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51ad8b7fbbb2386b221fb69564402faae12e601846dffdd943725ddc41a53307" Nov 26 15:05:51 crc kubenswrapper[4651]: I1126 15:05:51.544188 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d20d-account-create-update-56vgl" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.854737 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-6ld6v"] Nov 26 15:05:52 crc kubenswrapper[4651]: E1126 15:05:52.855096 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0877833-64b2-4651-afcd-17ec0d3b4a44" containerName="dnsmasq-dns" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855382 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0877833-64b2-4651-afcd-17ec0d3b4a44" containerName="dnsmasq-dns" Nov 26 15:05:52 crc kubenswrapper[4651]: E1126 15:05:52.855402 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb658c85-2b54-43fb-9938-0c5558ae3da8" containerName="mariadb-account-create-update" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855410 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb658c85-2b54-43fb-9938-0c5558ae3da8" containerName="mariadb-account-create-update" Nov 26 15:05:52 crc kubenswrapper[4651]: E1126 15:05:52.855418 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704d880b-4c5f-4663-9bb1-8e40fa9b6752" containerName="mariadb-database-create" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855425 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="704d880b-4c5f-4663-9bb1-8e40fa9b6752" containerName="mariadb-database-create" Nov 26 15:05:52 crc kubenswrapper[4651]: E1126 15:05:52.855436 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbf26af-444d-421b-8b60-4c6c343564cb" containerName="mariadb-database-create" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855442 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbf26af-444d-421b-8b60-4c6c343564cb" containerName="mariadb-database-create" Nov 26 15:05:52 crc kubenswrapper[4651]: E1126 15:05:52.855454 4651 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a2667dc1-777a-469d-8021-ff0881c8d0d2" containerName="mariadb-account-create-update" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855460 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2667dc1-777a-469d-8021-ff0881c8d0d2" containerName="mariadb-account-create-update" Nov 26 15:05:52 crc kubenswrapper[4651]: E1126 15:05:52.855473 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0877833-64b2-4651-afcd-17ec0d3b4a44" containerName="init" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855478 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0877833-64b2-4651-afcd-17ec0d3b4a44" containerName="init" Nov 26 15:05:52 crc kubenswrapper[4651]: E1126 15:05:52.855488 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5f579e-17c0-425c-bd60-0f4950eabdc8" containerName="mariadb-database-create" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855494 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5f579e-17c0-425c-bd60-0f4950eabdc8" containerName="mariadb-database-create" Nov 26 15:05:52 crc kubenswrapper[4651]: E1126 15:05:52.855503 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a57ad0-61ac-42e0-b0e2-602914415dee" containerName="mariadb-account-create-update" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855509 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a57ad0-61ac-42e0-b0e2-602914415dee" containerName="mariadb-account-create-update" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855643 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2667dc1-777a-469d-8021-ff0881c8d0d2" containerName="mariadb-account-create-update" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855654 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e5f579e-17c0-425c-bd60-0f4950eabdc8" containerName="mariadb-database-create" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 
15:05:52.855660 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb658c85-2b54-43fb-9938-0c5558ae3da8" containerName="mariadb-account-create-update" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855671 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a57ad0-61ac-42e0-b0e2-602914415dee" containerName="mariadb-account-create-update" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855682 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbf26af-444d-421b-8b60-4c6c343564cb" containerName="mariadb-database-create" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855692 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0877833-64b2-4651-afcd-17ec0d3b4a44" containerName="dnsmasq-dns" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.855702 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="704d880b-4c5f-4663-9bb1-8e40fa9b6752" containerName="mariadb-database-create" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.856286 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.866094 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.866302 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-62rlj" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.867885 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6ld6v"] Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.943505 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c7xg\" (UniqueName: \"kubernetes.io/projected/0dfcc6ac-236d-4333-9126-5ee10d1e0417-kube-api-access-4c7xg\") pod \"glance-db-sync-6ld6v\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.943558 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-config-data\") pod \"glance-db-sync-6ld6v\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.943583 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-combined-ca-bundle\") pod \"glance-db-sync-6ld6v\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:52 crc kubenswrapper[4651]: I1126 15:05:52.943609 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-db-sync-config-data\") pod \"glance-db-sync-6ld6v\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:53 crc kubenswrapper[4651]: I1126 15:05:53.044522 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-db-sync-config-data\") pod \"glance-db-sync-6ld6v\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:53 crc kubenswrapper[4651]: I1126 15:05:53.044897 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c7xg\" (UniqueName: \"kubernetes.io/projected/0dfcc6ac-236d-4333-9126-5ee10d1e0417-kube-api-access-4c7xg\") pod \"glance-db-sync-6ld6v\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:53 crc kubenswrapper[4651]: I1126 15:05:53.044998 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-config-data\") pod \"glance-db-sync-6ld6v\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:53 crc kubenswrapper[4651]: I1126 15:05:53.045115 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-combined-ca-bundle\") pod \"glance-db-sync-6ld6v\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:53 crc kubenswrapper[4651]: I1126 15:05:53.050276 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-db-sync-config-data\") pod \"glance-db-sync-6ld6v\" (UID: 
\"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:53 crc kubenswrapper[4651]: I1126 15:05:53.052277 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-config-data\") pod \"glance-db-sync-6ld6v\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:53 crc kubenswrapper[4651]: I1126 15:05:53.059714 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-combined-ca-bundle\") pod \"glance-db-sync-6ld6v\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:53 crc kubenswrapper[4651]: I1126 15:05:53.064006 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c7xg\" (UniqueName: \"kubernetes.io/projected/0dfcc6ac-236d-4333-9126-5ee10d1e0417-kube-api-access-4c7xg\") pod \"glance-db-sync-6ld6v\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:53 crc kubenswrapper[4651]: I1126 15:05:53.188384 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6ld6v" Nov 26 15:05:53 crc kubenswrapper[4651]: I1126 15:05:53.700935 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6ld6v"] Nov 26 15:05:53 crc kubenswrapper[4651]: I1126 15:05:53.915513 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zrhdf" podUID="13f26ce1-fcd6-47bf-b95d-d93e41dd795f" containerName="ovn-controller" probeResult="failure" output=< Nov 26 15:05:53 crc kubenswrapper[4651]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 26 15:05:53 crc kubenswrapper[4651]: > Nov 26 15:05:54 crc kubenswrapper[4651]: I1126 15:05:54.567065 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6ld6v" event={"ID":"0dfcc6ac-236d-4333-9126-5ee10d1e0417","Type":"ContainerStarted","Data":"a4c2162d1d5e11b3b04d91b69fc4d98bda66bbc6c0c0730b8d00f43a3e0f02ff"} Nov 26 15:05:56 crc kubenswrapper[4651]: I1126 15:05:56.189309 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:05:56 crc kubenswrapper[4651]: E1126 15:05:56.189454 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:05:56 crc kubenswrapper[4651]: E1126 15:05:56.189467 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:05:56 crc kubenswrapper[4651]: E1126 15:05:56.189520 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift podName:a3b8c2db-ce7f-48ce-9fd1-d55b5583773e nodeName:}" failed. 
No retries permitted until 2025-11-26 15:06:12.189505982 +0000 UTC m=+939.615253576 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift") pod "swift-storage-0" (UID: "a3b8c2db-ce7f-48ce-9fd1-d55b5583773e") : configmap "swift-ring-files" not found Nov 26 15:05:58 crc kubenswrapper[4651]: I1126 15:05:58.914343 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zrhdf" podUID="13f26ce1-fcd6-47bf-b95d-d93e41dd795f" containerName="ovn-controller" probeResult="failure" output=< Nov 26 15:05:58 crc kubenswrapper[4651]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 26 15:05:58 crc kubenswrapper[4651]: > Nov 26 15:05:58 crc kubenswrapper[4651]: I1126 15:05:58.941372 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4hsfq" Nov 26 15:05:58 crc kubenswrapper[4651]: I1126 15:05:58.974877 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4hsfq" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.133014 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.133116 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.133156 4651 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.133769 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1bed2bd078ae425b6996e470a55f2b4cd2080217fee4c7bfa79d544ccd51cf36"} pod="openshift-machine-config-operator/machine-config-daemon-99mrs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.133833 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" containerID="cri-o://1bed2bd078ae425b6996e470a55f2b4cd2080217fee4c7bfa79d544ccd51cf36" gracePeriod=600 Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.207215 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zrhdf-config-8nddf"] Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.208219 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.211796 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.213828 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zrhdf-config-8nddf"] Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.250667 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run-ovn\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.250739 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-log-ovn\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.250768 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-additional-scripts\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.250790 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcn7m\" (UniqueName: \"kubernetes.io/projected/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-kube-api-access-mcn7m\") pod 
\"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.250833 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.250851 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-scripts\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.352645 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run-ovn\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.352714 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-log-ovn\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.352745 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-additional-scripts\") 
pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.352779 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcn7m\" (UniqueName: \"kubernetes.io/projected/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-kube-api-access-mcn7m\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.352843 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.352870 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-scripts\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.355181 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.355190 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-log-ovn\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: 
\"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.355217 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run-ovn\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.355534 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-additional-scripts\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.358139 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-scripts\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.376928 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcn7m\" (UniqueName: \"kubernetes.io/projected/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-kube-api-access-mcn7m\") pod \"ovn-controller-zrhdf-config-8nddf\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.536529 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.619854 4651 generic.go:334] "Generic (PLEG): container finished" podID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerID="1bed2bd078ae425b6996e470a55f2b4cd2080217fee4c7bfa79d544ccd51cf36" exitCode=0 Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.619951 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerDied","Data":"1bed2bd078ae425b6996e470a55f2b4cd2080217fee4c7bfa79d544ccd51cf36"} Nov 26 15:05:59 crc kubenswrapper[4651]: I1126 15:05:59.619984 4651 scope.go:117] "RemoveContainer" containerID="c9df9330edcd7367fada547dd9b0bad3227c48b21a556e1698b8293c8ff9fe4a" Nov 26 15:06:01 crc kubenswrapper[4651]: I1126 15:06:01.635059 4651 generic.go:334] "Generic (PLEG): container finished" podID="8f351a70-5e04-4270-b9bb-00586a94da1f" containerID="a0186b2a528014a344cc52586bde2af9796297e5acadc7664ee8093e61ffa401" exitCode=0 Nov 26 15:06:01 crc kubenswrapper[4651]: I1126 15:06:01.635245 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8f351a70-5e04-4270-b9bb-00586a94da1f","Type":"ContainerDied","Data":"a0186b2a528014a344cc52586bde2af9796297e5acadc7664ee8093e61ffa401"} Nov 26 15:06:01 crc kubenswrapper[4651]: I1126 15:06:01.637076 4651 generic.go:334] "Generic (PLEG): container finished" podID="4fc026e6-8f32-45d0-bab4-c12dd93d946f" containerID="f714b1dc5c1abbf0d0a21f31b86e43500b5361442439a451dbe49beb89edb9ef" exitCode=0 Nov 26 15:06:01 crc kubenswrapper[4651]: I1126 15:06:01.637112 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4fc026e6-8f32-45d0-bab4-c12dd93d946f","Type":"ContainerDied","Data":"f714b1dc5c1abbf0d0a21f31b86e43500b5361442439a451dbe49beb89edb9ef"} Nov 26 15:06:03 crc 
kubenswrapper[4651]: I1126 15:06:03.917381 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zrhdf" podUID="13f26ce1-fcd6-47bf-b95d-d93e41dd795f" containerName="ovn-controller" probeResult="failure" output=< Nov 26 15:06:03 crc kubenswrapper[4651]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 26 15:06:03 crc kubenswrapper[4651]: > Nov 26 15:06:07 crc kubenswrapper[4651]: I1126 15:06:07.892136 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zrhdf-config-8nddf"] Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.718821 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"1b324081402e9e9abd725d1ece3f18cded052636ec277c013a1f5a3dea9b3cf7"} Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.721833 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4fc026e6-8f32-45d0-bab4-c12dd93d946f","Type":"ContainerStarted","Data":"e39e77a286a39632ce73f2c40db0a791b5e55dce1bd991ae820c35b33a183ac7"} Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.722071 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.724753 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6ld6v" event={"ID":"0dfcc6ac-236d-4333-9126-5ee10d1e0417","Type":"ContainerStarted","Data":"f05be8fa92b62730f2c248ba7b13e7f3aafdad78aadb18091ced8d6997a062f4"} Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.728898 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"8f351a70-5e04-4270-b9bb-00586a94da1f","Type":"ContainerStarted","Data":"a73805d74154e00057a1f8ca725c4f08da57599082c7d0e8dceb233b22698120"} Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.729147 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.732340 4651 generic.go:334] "Generic (PLEG): container finished" podID="de2a38b7-4f8d-433c-9d3f-47fb6da9bde4" containerID="872f6cedb7fd99f3da10b91a88fe1cba056c3c05144a395a57e9aa6d6db1ee9c" exitCode=0 Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.732413 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zrhdf-config-8nddf" event={"ID":"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4","Type":"ContainerDied","Data":"872f6cedb7fd99f3da10b91a88fe1cba056c3c05144a395a57e9aa6d6db1ee9c"} Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.732462 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zrhdf-config-8nddf" event={"ID":"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4","Type":"ContainerStarted","Data":"563b9ba8502c8fec0a4e6ca9c8ef935ed48fac2371ddaa059240ffa09a2d5f8b"} Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.778356 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-6ld6v" podStartSLOduration=2.936479746 podStartE2EDuration="16.77833385s" podCreationTimestamp="2025-11-26 15:05:52 +0000 UTC" firstStartedPulling="2025-11-26 15:05:53.70942128 +0000 UTC m=+921.135168884" lastFinishedPulling="2025-11-26 15:06:07.551275384 +0000 UTC m=+934.977022988" observedRunningTime="2025-11-26 15:06:08.775332118 +0000 UTC m=+936.201079722" watchObservedRunningTime="2025-11-26 15:06:08.77833385 +0000 UTC m=+936.204081454" Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.823845 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" 
podStartSLOduration=-9223371950.030952 podStartE2EDuration="1m26.823823441s" podCreationTimestamp="2025-11-26 15:04:42 +0000 UTC" firstStartedPulling="2025-11-26 15:04:44.717298783 +0000 UTC m=+852.143046387" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:06:08.813328014 +0000 UTC m=+936.239075628" watchObservedRunningTime="2025-11-26 15:06:08.823823441 +0000 UTC m=+936.249571045" Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.857063 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=47.13399689 podStartE2EDuration="1m25.857014136s" podCreationTimestamp="2025-11-26 15:04:43 +0000 UTC" firstStartedPulling="2025-11-26 15:04:45.253216474 +0000 UTC m=+852.678964088" lastFinishedPulling="2025-11-26 15:05:23.97623373 +0000 UTC m=+891.401981334" observedRunningTime="2025-11-26 15:06:08.852379709 +0000 UTC m=+936.278127323" watchObservedRunningTime="2025-11-26 15:06:08.857014136 +0000 UTC m=+936.282761740" Nov 26 15:06:08 crc kubenswrapper[4651]: I1126 15:06:08.906060 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-zrhdf" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.100631 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.158105 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run\") pod \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.158182 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-log-ovn\") pod \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.158219 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run" (OuterVolumeSpecName: "var-run") pod "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4" (UID: "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.158238 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-scripts\") pod \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.158324 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-additional-scripts\") pod \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.158343 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcn7m\" (UniqueName: \"kubernetes.io/projected/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-kube-api-access-mcn7m\") pod \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.158363 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run-ovn\") pod \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\" (UID: \"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4\") " Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.158865 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4" (UID: "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.158959 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4" (UID: "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.158918 4651 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.159323 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-scripts" (OuterVolumeSpecName: "scripts") pod "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4" (UID: "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.159559 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4" (UID: "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.168209 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-kube-api-access-mcn7m" (OuterVolumeSpecName: "kube-api-access-mcn7m") pod "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4" (UID: "de2a38b7-4f8d-433c-9d3f-47fb6da9bde4"). 
InnerVolumeSpecName "kube-api-access-mcn7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.260281 4651 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.260324 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.260337 4651 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.260352 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcn7m\" (UniqueName: \"kubernetes.io/projected/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-kube-api-access-mcn7m\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.260365 4651 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.749581 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zrhdf-config-8nddf" event={"ID":"de2a38b7-4f8d-433c-9d3f-47fb6da9bde4","Type":"ContainerDied","Data":"563b9ba8502c8fec0a4e6ca9c8ef935ed48fac2371ddaa059240ffa09a2d5f8b"} Nov 26 15:06:10 crc kubenswrapper[4651]: I1126 15:06:10.750216 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="563b9ba8502c8fec0a4e6ca9c8ef935ed48fac2371ddaa059240ffa09a2d5f8b" Nov 26 15:06:10 crc 
kubenswrapper[4651]: I1126 15:06:10.749624 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zrhdf-config-8nddf" Nov 26 15:06:11 crc kubenswrapper[4651]: I1126 15:06:11.215209 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zrhdf-config-8nddf"] Nov 26 15:06:11 crc kubenswrapper[4651]: I1126 15:06:11.223750 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zrhdf-config-8nddf"] Nov 26 15:06:11 crc kubenswrapper[4651]: I1126 15:06:11.414131 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de2a38b7-4f8d-433c-9d3f-47fb6da9bde4" path="/var/lib/kubelet/pods/de2a38b7-4f8d-433c-9d3f-47fb6da9bde4/volumes" Nov 26 15:06:12 crc kubenswrapper[4651]: I1126 15:06:12.191818 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:06:12 crc kubenswrapper[4651]: E1126 15:06:12.192018 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:06:12 crc kubenswrapper[4651]: E1126 15:06:12.192216 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:06:12 crc kubenswrapper[4651]: E1126 15:06:12.192272 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift podName:a3b8c2db-ce7f-48ce-9fd1-d55b5583773e nodeName:}" failed. No retries permitted until 2025-11-26 15:06:44.192255465 +0000 UTC m=+971.618003069 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift") pod "swift-storage-0" (UID: "a3b8c2db-ce7f-48ce-9fd1-d55b5583773e") : configmap "swift-ring-files" not found Nov 26 15:06:17 crc kubenswrapper[4651]: I1126 15:06:17.799437 4651 generic.go:334] "Generic (PLEG): container finished" podID="0dfcc6ac-236d-4333-9126-5ee10d1e0417" containerID="f05be8fa92b62730f2c248ba7b13e7f3aafdad78aadb18091ced8d6997a062f4" exitCode=0 Nov 26 15:06:17 crc kubenswrapper[4651]: I1126 15:06:17.799507 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6ld6v" event={"ID":"0dfcc6ac-236d-4333-9126-5ee10d1e0417","Type":"ContainerDied","Data":"f05be8fa92b62730f2c248ba7b13e7f3aafdad78aadb18091ced8d6997a062f4"} Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.226270 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6ld6v" Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.308824 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-config-data\") pod \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.309645 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c7xg\" (UniqueName: \"kubernetes.io/projected/0dfcc6ac-236d-4333-9126-5ee10d1e0417-kube-api-access-4c7xg\") pod \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.309699 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-db-sync-config-data\") pod 
\"0dfcc6ac-236d-4333-9126-5ee10d1e0417\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.309766 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-combined-ca-bundle\") pod \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\" (UID: \"0dfcc6ac-236d-4333-9126-5ee10d1e0417\") " Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.314213 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0dfcc6ac-236d-4333-9126-5ee10d1e0417" (UID: "0dfcc6ac-236d-4333-9126-5ee10d1e0417"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.317190 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dfcc6ac-236d-4333-9126-5ee10d1e0417-kube-api-access-4c7xg" (OuterVolumeSpecName: "kube-api-access-4c7xg") pod "0dfcc6ac-236d-4333-9126-5ee10d1e0417" (UID: "0dfcc6ac-236d-4333-9126-5ee10d1e0417"). InnerVolumeSpecName "kube-api-access-4c7xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.332658 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dfcc6ac-236d-4333-9126-5ee10d1e0417" (UID: "0dfcc6ac-236d-4333-9126-5ee10d1e0417"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.356210 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-config-data" (OuterVolumeSpecName: "config-data") pod "0dfcc6ac-236d-4333-9126-5ee10d1e0417" (UID: "0dfcc6ac-236d-4333-9126-5ee10d1e0417"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.411320 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c7xg\" (UniqueName: \"kubernetes.io/projected/0dfcc6ac-236d-4333-9126-5ee10d1e0417-kube-api-access-4c7xg\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.411359 4651 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.411375 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.411386 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dfcc6ac-236d-4333-9126-5ee10d1e0417-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.817236 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6ld6v" event={"ID":"0dfcc6ac-236d-4333-9126-5ee10d1e0417","Type":"ContainerDied","Data":"a4c2162d1d5e11b3b04d91b69fc4d98bda66bbc6c0c0730b8d00f43a3e0f02ff"} Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.817279 4651 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="a4c2162d1d5e11b3b04d91b69fc4d98bda66bbc6c0c0730b8d00f43a3e0f02ff" Nov 26 15:06:19 crc kubenswrapper[4651]: I1126 15:06:19.817556 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6ld6v" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.284521 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-5vwzl"] Nov 26 15:06:20 crc kubenswrapper[4651]: E1126 15:06:20.285716 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dfcc6ac-236d-4333-9126-5ee10d1e0417" containerName="glance-db-sync" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.285795 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dfcc6ac-236d-4333-9126-5ee10d1e0417" containerName="glance-db-sync" Nov 26 15:06:20 crc kubenswrapper[4651]: E1126 15:06:20.285879 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2a38b7-4f8d-433c-9d3f-47fb6da9bde4" containerName="ovn-config" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.285934 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2a38b7-4f8d-433c-9d3f-47fb6da9bde4" containerName="ovn-config" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.286173 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2a38b7-4f8d-433c-9d3f-47fb6da9bde4" containerName="ovn-config" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.286267 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dfcc6ac-236d-4333-9126-5ee10d1e0417" containerName="glance-db-sync" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.287187 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.303546 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-5vwzl"] Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.425211 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.425313 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.425353 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2qv6\" (UniqueName: \"kubernetes.io/projected/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-kube-api-access-p2qv6\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.425385 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.425408 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-config\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.528757 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.528831 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2qv6\" (UniqueName: \"kubernetes.io/projected/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-kube-api-access-p2qv6\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.528863 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.528878 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-config\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.528947 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.529707 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.529711 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.529986 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-config\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.530670 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.549069 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2qv6\" (UniqueName: \"kubernetes.io/projected/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-kube-api-access-p2qv6\") pod 
\"dnsmasq-dns-5b946c75cc-5vwzl\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.607601 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:20 crc kubenswrapper[4651]: I1126 15:06:20.891179 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-5vwzl"] Nov 26 15:06:20 crc kubenswrapper[4651]: W1126 15:06:20.905643 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96e9fa03_a4a5_4ecb_8f87_feb41f41083f.slice/crio-fc65e9b354f2229669ad30c011dbe04cc266acdfc3ccf95f0d8d8f250c6fa6c6 WatchSource:0}: Error finding container fc65e9b354f2229669ad30c011dbe04cc266acdfc3ccf95f0d8d8f250c6fa6c6: Status 404 returned error can't find the container with id fc65e9b354f2229669ad30c011dbe04cc266acdfc3ccf95f0d8d8f250c6fa6c6 Nov 26 15:06:21 crc kubenswrapper[4651]: I1126 15:06:21.834230 4651 generic.go:334] "Generic (PLEG): container finished" podID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerID="c56239cc30c2b6bffa54475fe5d1da40ce0443cd645e144b63147f83ffefced7" exitCode=0 Nov 26 15:06:21 crc kubenswrapper[4651]: I1126 15:06:21.834410 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" event={"ID":"96e9fa03-a4a5-4ecb-8f87-feb41f41083f","Type":"ContainerDied","Data":"c56239cc30c2b6bffa54475fe5d1da40ce0443cd645e144b63147f83ffefced7"} Nov 26 15:06:21 crc kubenswrapper[4651]: I1126 15:06:21.836271 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" event={"ID":"96e9fa03-a4a5-4ecb-8f87-feb41f41083f","Type":"ContainerStarted","Data":"fc65e9b354f2229669ad30c011dbe04cc266acdfc3ccf95f0d8d8f250c6fa6c6"} Nov 26 15:06:22 crc kubenswrapper[4651]: I1126 15:06:22.847906 4651 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" event={"ID":"96e9fa03-a4a5-4ecb-8f87-feb41f41083f","Type":"ContainerStarted","Data":"38d2aa0310885489fb063b1a8b2ed3ccf0d19b385a8afba9a40c39fe0111e6ef"} Nov 26 15:06:22 crc kubenswrapper[4651]: I1126 15:06:22.848391 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:22 crc kubenswrapper[4651]: I1126 15:06:22.876936 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" podStartSLOduration=2.876894481 podStartE2EDuration="2.876894481s" podCreationTimestamp="2025-11-26 15:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:06:22.871455093 +0000 UTC m=+950.297202707" watchObservedRunningTime="2025-11-26 15:06:22.876894481 +0000 UTC m=+950.302642075" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.109199 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.432733 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.749125 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-28wwh"] Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.750278 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-28wwh" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.775358 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6fkm4"] Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.776789 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6fkm4" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.801990 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-28wwh"] Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.809559 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78599f6-a349-4abd-b862-37ea4d85818d-operator-scripts\") pod \"cinder-db-create-28wwh\" (UID: \"a78599f6-a349-4abd-b862-37ea4d85818d\") " pod="openstack/cinder-db-create-28wwh" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.809626 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65lgt\" (UniqueName: \"kubernetes.io/projected/f46f23b6-3605-4160-a29e-b7f2a84b48f5-kube-api-access-65lgt\") pod \"barbican-db-create-6fkm4\" (UID: \"f46f23b6-3605-4160-a29e-b7f2a84b48f5\") " pod="openstack/barbican-db-create-6fkm4" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.809667 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gkr4\" (UniqueName: \"kubernetes.io/projected/a78599f6-a349-4abd-b862-37ea4d85818d-kube-api-access-6gkr4\") pod \"cinder-db-create-28wwh\" (UID: \"a78599f6-a349-4abd-b862-37ea4d85818d\") " pod="openstack/cinder-db-create-28wwh" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.809772 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46f23b6-3605-4160-a29e-b7f2a84b48f5-operator-scripts\") pod \"barbican-db-create-6fkm4\" (UID: \"f46f23b6-3605-4160-a29e-b7f2a84b48f5\") " pod="openstack/barbican-db-create-6fkm4" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.819641 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-db-create-6fkm4"] Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.845701 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c56c-account-create-update-j5bf5"] Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.846854 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c56c-account-create-update-j5bf5" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.855141 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.885969 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c56c-account-create-update-j5bf5"] Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.913911 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a7b363a-a7d4-4197-b711-2d3a0b761273-operator-scripts\") pod \"barbican-c56c-account-create-update-j5bf5\" (UID: \"0a7b363a-a7d4-4197-b711-2d3a0b761273\") " pod="openstack/barbican-c56c-account-create-update-j5bf5" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.913969 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmw9p\" (UniqueName: \"kubernetes.io/projected/0a7b363a-a7d4-4197-b711-2d3a0b761273-kube-api-access-nmw9p\") pod \"barbican-c56c-account-create-update-j5bf5\" (UID: \"0a7b363a-a7d4-4197-b711-2d3a0b761273\") " pod="openstack/barbican-c56c-account-create-update-j5bf5" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.914012 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46f23b6-3605-4160-a29e-b7f2a84b48f5-operator-scripts\") pod \"barbican-db-create-6fkm4\" (UID: \"f46f23b6-3605-4160-a29e-b7f2a84b48f5\") " 
pod="openstack/barbican-db-create-6fkm4" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.914082 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78599f6-a349-4abd-b862-37ea4d85818d-operator-scripts\") pod \"cinder-db-create-28wwh\" (UID: \"a78599f6-a349-4abd-b862-37ea4d85818d\") " pod="openstack/cinder-db-create-28wwh" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.914115 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65lgt\" (UniqueName: \"kubernetes.io/projected/f46f23b6-3605-4160-a29e-b7f2a84b48f5-kube-api-access-65lgt\") pod \"barbican-db-create-6fkm4\" (UID: \"f46f23b6-3605-4160-a29e-b7f2a84b48f5\") " pod="openstack/barbican-db-create-6fkm4" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.914165 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gkr4\" (UniqueName: \"kubernetes.io/projected/a78599f6-a349-4abd-b862-37ea4d85818d-kube-api-access-6gkr4\") pod \"cinder-db-create-28wwh\" (UID: \"a78599f6-a349-4abd-b862-37ea4d85818d\") " pod="openstack/cinder-db-create-28wwh" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.915159 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46f23b6-3605-4160-a29e-b7f2a84b48f5-operator-scripts\") pod \"barbican-db-create-6fkm4\" (UID: \"f46f23b6-3605-4160-a29e-b7f2a84b48f5\") " pod="openstack/barbican-db-create-6fkm4" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.915755 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78599f6-a349-4abd-b862-37ea4d85818d-operator-scripts\") pod \"cinder-db-create-28wwh\" (UID: \"a78599f6-a349-4abd-b862-37ea4d85818d\") " pod="openstack/cinder-db-create-28wwh" Nov 26 15:06:24 crc kubenswrapper[4651]: 
I1126 15:06:24.948252 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65lgt\" (UniqueName: \"kubernetes.io/projected/f46f23b6-3605-4160-a29e-b7f2a84b48f5-kube-api-access-65lgt\") pod \"barbican-db-create-6fkm4\" (UID: \"f46f23b6-3605-4160-a29e-b7f2a84b48f5\") " pod="openstack/barbican-db-create-6fkm4" Nov 26 15:06:24 crc kubenswrapper[4651]: I1126 15:06:24.953782 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gkr4\" (UniqueName: \"kubernetes.io/projected/a78599f6-a349-4abd-b862-37ea4d85818d-kube-api-access-6gkr4\") pod \"cinder-db-create-28wwh\" (UID: \"a78599f6-a349-4abd-b862-37ea4d85818d\") " pod="openstack/cinder-db-create-28wwh" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.026642 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a7b363a-a7d4-4197-b711-2d3a0b761273-operator-scripts\") pod \"barbican-c56c-account-create-update-j5bf5\" (UID: \"0a7b363a-a7d4-4197-b711-2d3a0b761273\") " pod="openstack/barbican-c56c-account-create-update-j5bf5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.034650 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmw9p\" (UniqueName: \"kubernetes.io/projected/0a7b363a-a7d4-4197-b711-2d3a0b761273-kube-api-access-nmw9p\") pod \"barbican-c56c-account-create-update-j5bf5\" (UID: \"0a7b363a-a7d4-4197-b711-2d3a0b761273\") " pod="openstack/barbican-c56c-account-create-update-j5bf5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.036858 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a7b363a-a7d4-4197-b711-2d3a0b761273-operator-scripts\") pod \"barbican-c56c-account-create-update-j5bf5\" (UID: \"0a7b363a-a7d4-4197-b711-2d3a0b761273\") " pod="openstack/barbican-c56c-account-create-update-j5bf5" Nov 26 
15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.072683 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-28wwh" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.077389 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-wvffz"] Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.079898 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wvffz" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.099626 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6fkm4" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.117915 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmw9p\" (UniqueName: \"kubernetes.io/projected/0a7b363a-a7d4-4197-b711-2d3a0b761273-kube-api-access-nmw9p\") pod \"barbican-c56c-account-create-update-j5bf5\" (UID: \"0a7b363a-a7d4-4197-b711-2d3a0b761273\") " pod="openstack/barbican-c56c-account-create-update-j5bf5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.120878 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wvffz"] Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.137941 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-operator-scripts\") pod \"neutron-db-create-wvffz\" (UID: \"88ffdfc1-d77f-4094-a0ba-2800d4c4d878\") " pod="openstack/neutron-db-create-wvffz" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.138073 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2t27\" (UniqueName: \"kubernetes.io/projected/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-kube-api-access-p2t27\") pod \"neutron-db-create-wvffz\" (UID: 
\"88ffdfc1-d77f-4094-a0ba-2800d4c4d878\") " pod="openstack/neutron-db-create-wvffz" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.160893 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c56c-account-create-update-j5bf5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.230178 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3572-account-create-update-b7cth"] Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.231418 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3572-account-create-update-b7cth" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.239685 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-operator-scripts\") pod \"neutron-db-create-wvffz\" (UID: \"88ffdfc1-d77f-4094-a0ba-2800d4c4d878\") " pod="openstack/neutron-db-create-wvffz" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.239754 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2t27\" (UniqueName: \"kubernetes.io/projected/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-kube-api-access-p2t27\") pod \"neutron-db-create-wvffz\" (UID: \"88ffdfc1-d77f-4094-a0ba-2800d4c4d878\") " pod="openstack/neutron-db-create-wvffz" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.240553 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-operator-scripts\") pod \"neutron-db-create-wvffz\" (UID: \"88ffdfc1-d77f-4094-a0ba-2800d4c4d878\") " pod="openstack/neutron-db-create-wvffz" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.255352 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 26 15:06:25 crc 
kubenswrapper[4651]: I1126 15:06:25.301109 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3572-account-create-update-b7cth"] Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.316783 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2t27\" (UniqueName: \"kubernetes.io/projected/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-kube-api-access-p2t27\") pod \"neutron-db-create-wvffz\" (UID: \"88ffdfc1-d77f-4094-a0ba-2800d4c4d878\") " pod="openstack/neutron-db-create-wvffz" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.342672 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngzlh\" (UniqueName: \"kubernetes.io/projected/8d514364-b561-4d18-9b82-bfd428216060-kube-api-access-ngzlh\") pod \"cinder-3572-account-create-update-b7cth\" (UID: \"8d514364-b561-4d18-9b82-bfd428216060\") " pod="openstack/cinder-3572-account-create-update-b7cth" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.342783 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d514364-b561-4d18-9b82-bfd428216060-operator-scripts\") pod \"cinder-3572-account-create-update-b7cth\" (UID: \"8d514364-b561-4d18-9b82-bfd428216060\") " pod="openstack/cinder-3572-account-create-update-b7cth" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.399415 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bb66-account-create-update-s2ll5"] Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.400794 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bb66-account-create-update-s2ll5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.410608 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.444911 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d514364-b561-4d18-9b82-bfd428216060-operator-scripts\") pod \"cinder-3572-account-create-update-b7cth\" (UID: \"8d514364-b561-4d18-9b82-bfd428216060\") " pod="openstack/cinder-3572-account-create-update-b7cth" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.445059 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncchj\" (UniqueName: \"kubernetes.io/projected/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-kube-api-access-ncchj\") pod \"neutron-bb66-account-create-update-s2ll5\" (UID: \"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89\") " pod="openstack/neutron-bb66-account-create-update-s2ll5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.445091 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-operator-scripts\") pod \"neutron-bb66-account-create-update-s2ll5\" (UID: \"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89\") " pod="openstack/neutron-bb66-account-create-update-s2ll5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.445144 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngzlh\" (UniqueName: \"kubernetes.io/projected/8d514364-b561-4d18-9b82-bfd428216060-kube-api-access-ngzlh\") pod \"cinder-3572-account-create-update-b7cth\" (UID: \"8d514364-b561-4d18-9b82-bfd428216060\") " pod="openstack/cinder-3572-account-create-update-b7cth" Nov 26 15:06:25 crc 
kubenswrapper[4651]: I1126 15:06:25.446484 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bb66-account-create-update-s2ll5"] Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.446929 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d514364-b561-4d18-9b82-bfd428216060-operator-scripts\") pod \"cinder-3572-account-create-update-b7cth\" (UID: \"8d514364-b561-4d18-9b82-bfd428216060\") " pod="openstack/cinder-3572-account-create-update-b7cth" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.491666 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngzlh\" (UniqueName: \"kubernetes.io/projected/8d514364-b561-4d18-9b82-bfd428216060-kube-api-access-ngzlh\") pod \"cinder-3572-account-create-update-b7cth\" (UID: \"8d514364-b561-4d18-9b82-bfd428216060\") " pod="openstack/cinder-3572-account-create-update-b7cth" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.537832 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-wvffz" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.546791 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncchj\" (UniqueName: \"kubernetes.io/projected/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-kube-api-access-ncchj\") pod \"neutron-bb66-account-create-update-s2ll5\" (UID: \"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89\") " pod="openstack/neutron-bb66-account-create-update-s2ll5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.546888 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-operator-scripts\") pod \"neutron-bb66-account-create-update-s2ll5\" (UID: \"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89\") " pod="openstack/neutron-bb66-account-create-update-s2ll5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.547930 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-operator-scripts\") pod \"neutron-bb66-account-create-update-s2ll5\" (UID: \"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89\") " pod="openstack/neutron-bb66-account-create-update-s2ll5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.576193 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3572-account-create-update-b7cth" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.585318 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-72sf9"] Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.586252 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-72sf9"] Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.586322 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.590066 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.590149 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.590249 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k89xc" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.590298 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.590378 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncchj\" (UniqueName: \"kubernetes.io/projected/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-kube-api-access-ncchj\") pod \"neutron-bb66-account-create-update-s2ll5\" (UID: \"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89\") " pod="openstack/neutron-bb66-account-create-update-s2ll5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.651277 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlw7r\" (UniqueName: \"kubernetes.io/projected/ae123901-25f9-4788-b666-bcb72066c3c4-kube-api-access-tlw7r\") pod \"keystone-db-sync-72sf9\" (UID: \"ae123901-25f9-4788-b666-bcb72066c3c4\") " pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.651379 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-config-data\") pod \"keystone-db-sync-72sf9\" (UID: \"ae123901-25f9-4788-b666-bcb72066c3c4\") " pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.651408 
4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-combined-ca-bundle\") pod \"keystone-db-sync-72sf9\" (UID: \"ae123901-25f9-4788-b666-bcb72066c3c4\") " pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.739529 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bb66-account-create-update-s2ll5" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.753625 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlw7r\" (UniqueName: \"kubernetes.io/projected/ae123901-25f9-4788-b666-bcb72066c3c4-kube-api-access-tlw7r\") pod \"keystone-db-sync-72sf9\" (UID: \"ae123901-25f9-4788-b666-bcb72066c3c4\") " pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.753956 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-config-data\") pod \"keystone-db-sync-72sf9\" (UID: \"ae123901-25f9-4788-b666-bcb72066c3c4\") " pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.753989 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-combined-ca-bundle\") pod \"keystone-db-sync-72sf9\" (UID: \"ae123901-25f9-4788-b666-bcb72066c3c4\") " pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.760405 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-combined-ca-bundle\") pod \"keystone-db-sync-72sf9\" (UID: 
\"ae123901-25f9-4788-b666-bcb72066c3c4\") " pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.761515 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-config-data\") pod \"keystone-db-sync-72sf9\" (UID: \"ae123901-25f9-4788-b666-bcb72066c3c4\") " pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.773964 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlw7r\" (UniqueName: \"kubernetes.io/projected/ae123901-25f9-4788-b666-bcb72066c3c4-kube-api-access-tlw7r\") pod \"keystone-db-sync-72sf9\" (UID: \"ae123901-25f9-4788-b666-bcb72066c3c4\") " pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:25 crc kubenswrapper[4651]: I1126 15:06:25.923060 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.111074 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-28wwh"] Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.260893 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c56c-account-create-update-j5bf5"] Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.277995 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6fkm4"] Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.376679 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wvffz"] Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.391115 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3572-account-create-update-b7cth"] Nov 26 15:06:26 crc kubenswrapper[4651]: W1126 15:06:26.402584 4651 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d514364_b561_4d18_9b82_bfd428216060.slice/crio-97ad0bf5cb576b268fbaf90420d63073eb85ba3a406170fd31b1914fd6c30c96 WatchSource:0}: Error finding container 97ad0bf5cb576b268fbaf90420d63073eb85ba3a406170fd31b1914fd6c30c96: Status 404 returned error can't find the container with id 97ad0bf5cb576b268fbaf90420d63073eb85ba3a406170fd31b1914fd6c30c96 Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.658863 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-72sf9"] Nov 26 15:06:26 crc kubenswrapper[4651]: W1126 15:06:26.680870 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae123901_25f9_4788_b666_bcb72066c3c4.slice/crio-48cd3a026956a4a4bfba9ac7f58e3c3b34fc61b48c45760904a96e0dc06fc1dd WatchSource:0}: Error finding container 48cd3a026956a4a4bfba9ac7f58e3c3b34fc61b48c45760904a96e0dc06fc1dd: Status 404 returned error can't find the container with id 48cd3a026956a4a4bfba9ac7f58e3c3b34fc61b48c45760904a96e0dc06fc1dd Nov 26 15:06:26 crc kubenswrapper[4651]: W1126 15:06:26.774959 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bf0489a_9b4f_4cd4_95a8_42a5fd115b89.slice/crio-812ffcc19c772c961839c7288a248303184c4b45355277a44b1d39aaec6e60b0 WatchSource:0}: Error finding container 812ffcc19c772c961839c7288a248303184c4b45355277a44b1d39aaec6e60b0: Status 404 returned error can't find the container with id 812ffcc19c772c961839c7288a248303184c4b45355277a44b1d39aaec6e60b0 Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.776899 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bb66-account-create-update-s2ll5"] Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.894615 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-28wwh" 
event={"ID":"a78599f6-a349-4abd-b862-37ea4d85818d","Type":"ContainerStarted","Data":"fe75ce8679b38adb7c5ccd3ee6a133021aa2b07458230cccece86e49f311440c"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.894657 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-28wwh" event={"ID":"a78599f6-a349-4abd-b862-37ea4d85818d","Type":"ContainerStarted","Data":"fd6466b1d0095ba45452f1402de4aea46a6f2776ff3c666bdee2aa2ca7e51e71"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.899417 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wvffz" event={"ID":"88ffdfc1-d77f-4094-a0ba-2800d4c4d878","Type":"ContainerStarted","Data":"533ff47ca916e9c0b15ad431af874c76a9c98d1cbb32f6e2e069fdf6d346a74b"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.899461 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wvffz" event={"ID":"88ffdfc1-d77f-4094-a0ba-2800d4c4d878","Type":"ContainerStarted","Data":"dda40622ac6750a5ec0a34ea00defb30571e077733a4ba3fa3845d1f25bfcc6d"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.910414 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb66-account-create-update-s2ll5" event={"ID":"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89","Type":"ContainerStarted","Data":"812ffcc19c772c961839c7288a248303184c4b45355277a44b1d39aaec6e60b0"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.913942 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-28wwh" podStartSLOduration=2.91392387 podStartE2EDuration="2.91392387s" podCreationTimestamp="2025-11-26 15:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:06:26.912341037 +0000 UTC m=+954.338088651" watchObservedRunningTime="2025-11-26 15:06:26.91392387 +0000 UTC m=+954.339671474" Nov 26 15:06:26 crc 
kubenswrapper[4651]: I1126 15:06:26.919437 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-72sf9" event={"ID":"ae123901-25f9-4788-b666-bcb72066c3c4","Type":"ContainerStarted","Data":"48cd3a026956a4a4bfba9ac7f58e3c3b34fc61b48c45760904a96e0dc06fc1dd"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.921856 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6fkm4" event={"ID":"f46f23b6-3605-4160-a29e-b7f2a84b48f5","Type":"ContainerStarted","Data":"a21cad0ca94a05e4d63a0013598a2ae472c411d7f7795fed14423ae73ab97c88"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.922137 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6fkm4" event={"ID":"f46f23b6-3605-4160-a29e-b7f2a84b48f5","Type":"ContainerStarted","Data":"1188cb1e1e47e6989918f7fe1ff4ee3e4661ff5b22b431ccb19d138a93478314"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.931094 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3572-account-create-update-b7cth" event={"ID":"8d514364-b561-4d18-9b82-bfd428216060","Type":"ContainerStarted","Data":"f12a3b347d022be2751ae2ce9c080d9b7a93763a9d10ed87767a90545aec447a"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.931148 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3572-account-create-update-b7cth" event={"ID":"8d514364-b561-4d18-9b82-bfd428216060","Type":"ContainerStarted","Data":"97ad0bf5cb576b268fbaf90420d63073eb85ba3a406170fd31b1914fd6c30c96"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.938661 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-wvffz" podStartSLOduration=1.938641714 podStartE2EDuration="1.938641714s" podCreationTimestamp="2025-11-26 15:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 
15:06:26.931102578 +0000 UTC m=+954.356850182" watchObservedRunningTime="2025-11-26 15:06:26.938641714 +0000 UTC m=+954.364389328" Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.940242 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c56c-account-create-update-j5bf5" event={"ID":"0a7b363a-a7d4-4197-b711-2d3a0b761273","Type":"ContainerStarted","Data":"536eddbf3a5d17779f9e0f97742827c4d08a433086a5f5c60e49071adc3f6111"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.940286 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c56c-account-create-update-j5bf5" event={"ID":"0a7b363a-a7d4-4197-b711-2d3a0b761273","Type":"ContainerStarted","Data":"897776395c11f26ae7820ed89446464d47f457787009010e25c9bdde40ff84fb"} Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.976814 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-6fkm4" podStartSLOduration=2.976795984 podStartE2EDuration="2.976795984s" podCreationTimestamp="2025-11-26 15:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:06:26.964816647 +0000 UTC m=+954.390564261" watchObservedRunningTime="2025-11-26 15:06:26.976795984 +0000 UTC m=+954.402543588" Nov 26 15:06:26 crc kubenswrapper[4651]: I1126 15:06:26.990562 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-3572-account-create-update-b7cth" podStartSLOduration=1.990540319 podStartE2EDuration="1.990540319s" podCreationTimestamp="2025-11-26 15:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:06:26.98985155 +0000 UTC m=+954.415599154" watchObservedRunningTime="2025-11-26 15:06:26.990540319 +0000 UTC m=+954.416287923" Nov 26 15:06:27 crc kubenswrapper[4651]: I1126 15:06:27.015721 4651 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-c56c-account-create-update-j5bf5" podStartSLOduration=3.015699675 podStartE2EDuration="3.015699675s" podCreationTimestamp="2025-11-26 15:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:06:27.015004875 +0000 UTC m=+954.440752489" watchObservedRunningTime="2025-11-26 15:06:27.015699675 +0000 UTC m=+954.441447279" Nov 26 15:06:27 crc kubenswrapper[4651]: E1126 15:06:27.320168 4651 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ffdfc1_d77f_4094_a0ba_2800d4c4d878.slice/crio-conmon-533ff47ca916e9c0b15ad431af874c76a9c98d1cbb32f6e2e069fdf6d346a74b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf46f23b6_3605_4160_a29e_b7f2a84b48f5.slice/crio-a21cad0ca94a05e4d63a0013598a2ae472c411d7f7795fed14423ae73ab97c88.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d514364_b561_4d18_9b82_bfd428216060.slice/crio-conmon-f12a3b347d022be2751ae2ce9c080d9b7a93763a9d10ed87767a90545aec447a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ffdfc1_d77f_4094_a0ba_2800d4c4d878.slice/crio-533ff47ca916e9c0b15ad431af874c76a9c98d1cbb32f6e2e069fdf6d346a74b.scope\": RecentStats: unable to find data in memory cache]" Nov 26 15:06:27 crc kubenswrapper[4651]: I1126 15:06:27.961193 4651 generic.go:334] "Generic (PLEG): container finished" podID="0a7b363a-a7d4-4197-b711-2d3a0b761273" containerID="536eddbf3a5d17779f9e0f97742827c4d08a433086a5f5c60e49071adc3f6111" exitCode=0 Nov 26 15:06:27 crc kubenswrapper[4651]: I1126 15:06:27.961281 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c56c-account-create-update-j5bf5" event={"ID":"0a7b363a-a7d4-4197-b711-2d3a0b761273","Type":"ContainerDied","Data":"536eddbf3a5d17779f9e0f97742827c4d08a433086a5f5c60e49071adc3f6111"} Nov 26 15:06:27 crc kubenswrapper[4651]: I1126 15:06:27.974285 4651 generic.go:334] "Generic (PLEG): container finished" podID="a78599f6-a349-4abd-b862-37ea4d85818d" containerID="fe75ce8679b38adb7c5ccd3ee6a133021aa2b07458230cccece86e49f311440c" exitCode=0 Nov 26 15:06:27 crc kubenswrapper[4651]: I1126 15:06:27.974362 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-28wwh" event={"ID":"a78599f6-a349-4abd-b862-37ea4d85818d","Type":"ContainerDied","Data":"fe75ce8679b38adb7c5ccd3ee6a133021aa2b07458230cccece86e49f311440c"} Nov 26 15:06:27 crc kubenswrapper[4651]: I1126 15:06:27.975839 4651 generic.go:334] "Generic (PLEG): container finished" podID="88ffdfc1-d77f-4094-a0ba-2800d4c4d878" containerID="533ff47ca916e9c0b15ad431af874c76a9c98d1cbb32f6e2e069fdf6d346a74b" exitCode=0 Nov 26 15:06:27 crc kubenswrapper[4651]: I1126 15:06:27.975894 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wvffz" event={"ID":"88ffdfc1-d77f-4094-a0ba-2800d4c4d878","Type":"ContainerDied","Data":"533ff47ca916e9c0b15ad431af874c76a9c98d1cbb32f6e2e069fdf6d346a74b"} Nov 26 15:06:27 crc kubenswrapper[4651]: I1126 15:06:27.978803 4651 generic.go:334] "Generic (PLEG): container finished" podID="3bf0489a-9b4f-4cd4-95a8-42a5fd115b89" containerID="9d2f12c635705e90d0fc108cd109541e25caf715d5cc945e0cedf40c55377682" exitCode=0 Nov 26 15:06:27 crc kubenswrapper[4651]: I1126 15:06:27.978848 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb66-account-create-update-s2ll5" event={"ID":"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89","Type":"ContainerDied","Data":"9d2f12c635705e90d0fc108cd109541e25caf715d5cc945e0cedf40c55377682"} Nov 26 15:06:27 crc 
kubenswrapper[4651]: I1126 15:06:27.981984 4651 generic.go:334] "Generic (PLEG): container finished" podID="f46f23b6-3605-4160-a29e-b7f2a84b48f5" containerID="a21cad0ca94a05e4d63a0013598a2ae472c411d7f7795fed14423ae73ab97c88" exitCode=0 Nov 26 15:06:27 crc kubenswrapper[4651]: I1126 15:06:27.982029 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6fkm4" event={"ID":"f46f23b6-3605-4160-a29e-b7f2a84b48f5","Type":"ContainerDied","Data":"a21cad0ca94a05e4d63a0013598a2ae472c411d7f7795fed14423ae73ab97c88"} Nov 26 15:06:28 crc kubenswrapper[4651]: I1126 15:06:28.003491 4651 generic.go:334] "Generic (PLEG): container finished" podID="8d514364-b561-4d18-9b82-bfd428216060" containerID="f12a3b347d022be2751ae2ce9c080d9b7a93763a9d10ed87767a90545aec447a" exitCode=0 Nov 26 15:06:28 crc kubenswrapper[4651]: I1126 15:06:28.003540 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3572-account-create-update-b7cth" event={"ID":"8d514364-b561-4d18-9b82-bfd428216060","Type":"ContainerDied","Data":"f12a3b347d022be2751ae2ce9c080d9b7a93763a9d10ed87767a90545aec447a"} Nov 26 15:06:30 crc kubenswrapper[4651]: I1126 15:06:30.609172 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:06:30 crc kubenswrapper[4651]: I1126 15:06:30.689261 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4ggsw"] Nov 26 15:06:30 crc kubenswrapper[4651]: I1126 15:06:30.702074 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-4ggsw" podUID="10d9303a-3724-4f7c-90b2-ef9ba8b92200" containerName="dnsmasq-dns" containerID="cri-o://1a71d942a66be87e50d7f32e3d0d4733833800c32392b3b90a8198e7b095a699" gracePeriod=10 Nov 26 15:06:31 crc kubenswrapper[4651]: I1126 15:06:31.036180 4651 generic.go:334] "Generic (PLEG): container finished" podID="10d9303a-3724-4f7c-90b2-ef9ba8b92200" 
containerID="1a71d942a66be87e50d7f32e3d0d4733833800c32392b3b90a8198e7b095a699" exitCode=0 Nov 26 15:06:31 crc kubenswrapper[4651]: I1126 15:06:31.036238 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4ggsw" event={"ID":"10d9303a-3724-4f7c-90b2-ef9ba8b92200","Type":"ContainerDied","Data":"1a71d942a66be87e50d7f32e3d0d4733833800c32392b3b90a8198e7b095a699"} Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.574984 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3572-account-create-update-b7cth" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.602060 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c56c-account-create-update-j5bf5" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.635776 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-28wwh" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.662510 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6fkm4" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.666326 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-wvffz" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.691564 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d514364-b561-4d18-9b82-bfd428216060-operator-scripts\") pod \"8d514364-b561-4d18-9b82-bfd428216060\" (UID: \"8d514364-b561-4d18-9b82-bfd428216060\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.691739 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmw9p\" (UniqueName: \"kubernetes.io/projected/0a7b363a-a7d4-4197-b711-2d3a0b761273-kube-api-access-nmw9p\") pod \"0a7b363a-a7d4-4197-b711-2d3a0b761273\" (UID: \"0a7b363a-a7d4-4197-b711-2d3a0b761273\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.691851 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngzlh\" (UniqueName: \"kubernetes.io/projected/8d514364-b561-4d18-9b82-bfd428216060-kube-api-access-ngzlh\") pod \"8d514364-b561-4d18-9b82-bfd428216060\" (UID: \"8d514364-b561-4d18-9b82-bfd428216060\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.691992 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2t27\" (UniqueName: \"kubernetes.io/projected/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-kube-api-access-p2t27\") pod \"88ffdfc1-d77f-4094-a0ba-2800d4c4d878\" (UID: \"88ffdfc1-d77f-4094-a0ba-2800d4c4d878\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.692183 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78599f6-a349-4abd-b862-37ea4d85818d-operator-scripts\") pod \"a78599f6-a349-4abd-b862-37ea4d85818d\" (UID: \"a78599f6-a349-4abd-b862-37ea4d85818d\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.692289 4651 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a7b363a-a7d4-4197-b711-2d3a0b761273-operator-scripts\") pod \"0a7b363a-a7d4-4197-b711-2d3a0b761273\" (UID: \"0a7b363a-a7d4-4197-b711-2d3a0b761273\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.692484 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65lgt\" (UniqueName: \"kubernetes.io/projected/f46f23b6-3605-4160-a29e-b7f2a84b48f5-kube-api-access-65lgt\") pod \"f46f23b6-3605-4160-a29e-b7f2a84b48f5\" (UID: \"f46f23b6-3605-4160-a29e-b7f2a84b48f5\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.692622 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46f23b6-3605-4160-a29e-b7f2a84b48f5-operator-scripts\") pod \"f46f23b6-3605-4160-a29e-b7f2a84b48f5\" (UID: \"f46f23b6-3605-4160-a29e-b7f2a84b48f5\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.692725 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-operator-scripts\") pod \"88ffdfc1-d77f-4094-a0ba-2800d4c4d878\" (UID: \"88ffdfc1-d77f-4094-a0ba-2800d4c4d878\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.692828 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gkr4\" (UniqueName: \"kubernetes.io/projected/a78599f6-a349-4abd-b862-37ea4d85818d-kube-api-access-6gkr4\") pod \"a78599f6-a349-4abd-b862-37ea4d85818d\" (UID: \"a78599f6-a349-4abd-b862-37ea4d85818d\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.697379 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a78599f6-a349-4abd-b862-37ea4d85818d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"a78599f6-a349-4abd-b862-37ea4d85818d" (UID: "a78599f6-a349-4abd-b862-37ea4d85818d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.700733 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a7b363a-a7d4-4197-b711-2d3a0b761273-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a7b363a-a7d4-4197-b711-2d3a0b761273" (UID: "0a7b363a-a7d4-4197-b711-2d3a0b761273"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.701552 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88ffdfc1-d77f-4094-a0ba-2800d4c4d878" (UID: "88ffdfc1-d77f-4094-a0ba-2800d4c4d878"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.702032 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d514364-b561-4d18-9b82-bfd428216060-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d514364-b561-4d18-9b82-bfd428216060" (UID: "8d514364-b561-4d18-9b82-bfd428216060"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.703549 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f46f23b6-3605-4160-a29e-b7f2a84b48f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f46f23b6-3605-4160-a29e-b7f2a84b48f5" (UID: "f46f23b6-3605-4160-a29e-b7f2a84b48f5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.705811 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78599f6-a349-4abd-b862-37ea4d85818d-kube-api-access-6gkr4" (OuterVolumeSpecName: "kube-api-access-6gkr4") pod "a78599f6-a349-4abd-b862-37ea4d85818d" (UID: "a78599f6-a349-4abd-b862-37ea4d85818d"). InnerVolumeSpecName "kube-api-access-6gkr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.709154 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d514364-b561-4d18-9b82-bfd428216060-kube-api-access-ngzlh" (OuterVolumeSpecName: "kube-api-access-ngzlh") pod "8d514364-b561-4d18-9b82-bfd428216060" (UID: "8d514364-b561-4d18-9b82-bfd428216060"). InnerVolumeSpecName "kube-api-access-ngzlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.709851 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7b363a-a7d4-4197-b711-2d3a0b761273-kube-api-access-nmw9p" (OuterVolumeSpecName: "kube-api-access-nmw9p") pod "0a7b363a-a7d4-4197-b711-2d3a0b761273" (UID: "0a7b363a-a7d4-4197-b711-2d3a0b761273"). InnerVolumeSpecName "kube-api-access-nmw9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.717546 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-kube-api-access-p2t27" (OuterVolumeSpecName: "kube-api-access-p2t27") pod "88ffdfc1-d77f-4094-a0ba-2800d4c4d878" (UID: "88ffdfc1-d77f-4094-a0ba-2800d4c4d878"). InnerVolumeSpecName "kube-api-access-p2t27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.726848 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46f23b6-3605-4160-a29e-b7f2a84b48f5-kube-api-access-65lgt" (OuterVolumeSpecName: "kube-api-access-65lgt") pod "f46f23b6-3605-4160-a29e-b7f2a84b48f5" (UID: "f46f23b6-3605-4160-a29e-b7f2a84b48f5"). InnerVolumeSpecName "kube-api-access-65lgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.729362 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bb66-account-create-update-s2ll5" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.732743 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.793643 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncchj\" (UniqueName: \"kubernetes.io/projected/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-kube-api-access-ncchj\") pod \"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89\" (UID: \"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.793749 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-config\") pod \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.793772 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-sb\") pod \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " Nov 26 15:06:32 crc 
kubenswrapper[4651]: I1126 15:06:32.793824 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qfwb\" (UniqueName: \"kubernetes.io/projected/10d9303a-3724-4f7c-90b2-ef9ba8b92200-kube-api-access-8qfwb\") pod \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.793850 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-dns-svc\") pod \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.793880 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-nb\") pod \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\" (UID: \"10d9303a-3724-4f7c-90b2-ef9ba8b92200\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.793905 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-operator-scripts\") pod \"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89\" (UID: \"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89\") " Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.794212 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gkr4\" (UniqueName: \"kubernetes.io/projected/a78599f6-a349-4abd-b862-37ea4d85818d-kube-api-access-6gkr4\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.794235 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d514364-b561-4d18-9b82-bfd428216060-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: 
I1126 15:06:32.794248 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmw9p\" (UniqueName: \"kubernetes.io/projected/0a7b363a-a7d4-4197-b711-2d3a0b761273-kube-api-access-nmw9p\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.794260 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngzlh\" (UniqueName: \"kubernetes.io/projected/8d514364-b561-4d18-9b82-bfd428216060-kube-api-access-ngzlh\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.794271 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2t27\" (UniqueName: \"kubernetes.io/projected/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-kube-api-access-p2t27\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.794283 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78599f6-a349-4abd-b862-37ea4d85818d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.794293 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a7b363a-a7d4-4197-b711-2d3a0b761273-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.794304 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65lgt\" (UniqueName: \"kubernetes.io/projected/f46f23b6-3605-4160-a29e-b7f2a84b48f5-kube-api-access-65lgt\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.794315 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46f23b6-3605-4160-a29e-b7f2a84b48f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.794326 4651 reconciler_common.go:293] 
"Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88ffdfc1-d77f-4094-a0ba-2800d4c4d878-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.794989 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3bf0489a-9b4f-4cd4-95a8-42a5fd115b89" (UID: "3bf0489a-9b4f-4cd4-95a8-42a5fd115b89"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.798191 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-kube-api-access-ncchj" (OuterVolumeSpecName: "kube-api-access-ncchj") pod "3bf0489a-9b4f-4cd4-95a8-42a5fd115b89" (UID: "3bf0489a-9b4f-4cd4-95a8-42a5fd115b89"). InnerVolumeSpecName "kube-api-access-ncchj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.799520 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d9303a-3724-4f7c-90b2-ef9ba8b92200-kube-api-access-8qfwb" (OuterVolumeSpecName: "kube-api-access-8qfwb") pod "10d9303a-3724-4f7c-90b2-ef9ba8b92200" (UID: "10d9303a-3724-4f7c-90b2-ef9ba8b92200"). InnerVolumeSpecName "kube-api-access-8qfwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.836184 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-config" (OuterVolumeSpecName: "config") pod "10d9303a-3724-4f7c-90b2-ef9ba8b92200" (UID: "10d9303a-3724-4f7c-90b2-ef9ba8b92200"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.838345 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10d9303a-3724-4f7c-90b2-ef9ba8b92200" (UID: "10d9303a-3724-4f7c-90b2-ef9ba8b92200"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.841678 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10d9303a-3724-4f7c-90b2-ef9ba8b92200" (UID: "10d9303a-3724-4f7c-90b2-ef9ba8b92200"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.852230 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10d9303a-3724-4f7c-90b2-ef9ba8b92200" (UID: "10d9303a-3724-4f7c-90b2-ef9ba8b92200"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.895671 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncchj\" (UniqueName: \"kubernetes.io/projected/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-kube-api-access-ncchj\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.896217 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.896317 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.896393 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qfwb\" (UniqueName: \"kubernetes.io/projected/10d9303a-3724-4f7c-90b2-ef9ba8b92200-kube-api-access-8qfwb\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.896493 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.896557 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10d9303a-3724-4f7c-90b2-ef9ba8b92200-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:32 crc kubenswrapper[4651]: I1126 15:06:32.896622 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.056801 
4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c56c-account-create-update-j5bf5" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.056808 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c56c-account-create-update-j5bf5" event={"ID":"0a7b363a-a7d4-4197-b711-2d3a0b761273","Type":"ContainerDied","Data":"897776395c11f26ae7820ed89446464d47f457787009010e25c9bdde40ff84fb"} Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.056901 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="897776395c11f26ae7820ed89446464d47f457787009010e25c9bdde40ff84fb" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.059355 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-28wwh" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.059390 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-28wwh" event={"ID":"a78599f6-a349-4abd-b862-37ea4d85818d","Type":"ContainerDied","Data":"fd6466b1d0095ba45452f1402de4aea46a6f2776ff3c666bdee2aa2ca7e51e71"} Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.059425 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd6466b1d0095ba45452f1402de4aea46a6f2776ff3c666bdee2aa2ca7e51e71" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.061899 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4ggsw" event={"ID":"10d9303a-3724-4f7c-90b2-ef9ba8b92200","Type":"ContainerDied","Data":"7a9b122af4309b44dc0fa3136e89615c8c9291be490a1ef9d7591b3d2b29dfd5"} Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.062128 4651 scope.go:117] "RemoveContainer" containerID="1a71d942a66be87e50d7f32e3d0d4733833800c32392b3b90a8198e7b095a699" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.062339 4651 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4ggsw" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.071777 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wvffz" event={"ID":"88ffdfc1-d77f-4094-a0ba-2800d4c4d878","Type":"ContainerDied","Data":"dda40622ac6750a5ec0a34ea00defb30571e077733a4ba3fa3845d1f25bfcc6d"} Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.071823 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda40622ac6750a5ec0a34ea00defb30571e077733a4ba3fa3845d1f25bfcc6d" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.071880 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wvffz" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.074371 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bb66-account-create-update-s2ll5" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.074371 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb66-account-create-update-s2ll5" event={"ID":"3bf0489a-9b4f-4cd4-95a8-42a5fd115b89","Type":"ContainerDied","Data":"812ffcc19c772c961839c7288a248303184c4b45355277a44b1d39aaec6e60b0"} Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.074411 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="812ffcc19c772c961839c7288a248303184c4b45355277a44b1d39aaec6e60b0" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.081469 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-72sf9" event={"ID":"ae123901-25f9-4788-b666-bcb72066c3c4","Type":"ContainerStarted","Data":"02c416faca21be52dd878cec856b81bd111a0dfa3543757c666d60e9829e50a2"} Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.083748 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6fkm4" 
event={"ID":"f46f23b6-3605-4160-a29e-b7f2a84b48f5","Type":"ContainerDied","Data":"1188cb1e1e47e6989918f7fe1ff4ee3e4661ff5b22b431ccb19d138a93478314"} Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.083784 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1188cb1e1e47e6989918f7fe1ff4ee3e4661ff5b22b431ccb19d138a93478314" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.083832 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6fkm4" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.094970 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3572-account-create-update-b7cth" event={"ID":"8d514364-b561-4d18-9b82-bfd428216060","Type":"ContainerDied","Data":"97ad0bf5cb576b268fbaf90420d63073eb85ba3a406170fd31b1914fd6c30c96"} Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.095210 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97ad0bf5cb576b268fbaf90420d63073eb85ba3a406170fd31b1914fd6c30c96" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.095323 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3572-account-create-update-b7cth" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.099571 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-72sf9" podStartSLOduration=2.349960526 podStartE2EDuration="8.099555935s" podCreationTimestamp="2025-11-26 15:06:25 +0000 UTC" firstStartedPulling="2025-11-26 15:06:26.687712875 +0000 UTC m=+954.113460479" lastFinishedPulling="2025-11-26 15:06:32.437308264 +0000 UTC m=+959.863055888" observedRunningTime="2025-11-26 15:06:33.095542246 +0000 UTC m=+960.521289860" watchObservedRunningTime="2025-11-26 15:06:33.099555935 +0000 UTC m=+960.525303539" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.119511 4651 scope.go:117] "RemoveContainer" containerID="cf2111f8328d8ff60aad4296092443239b09e224079a5b1f3cd30a51431b1f50" Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.140004 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4ggsw"] Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.146866 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4ggsw"] Nov 26 15:06:33 crc kubenswrapper[4651]: I1126 15:06:33.420438 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d9303a-3724-4f7c-90b2-ef9ba8b92200" path="/var/lib/kubelet/pods/10d9303a-3724-4f7c-90b2-ef9ba8b92200/volumes" Nov 26 15:06:37 crc kubenswrapper[4651]: I1126 15:06:37.130930 4651 generic.go:334] "Generic (PLEG): container finished" podID="ae123901-25f9-4788-b666-bcb72066c3c4" containerID="02c416faca21be52dd878cec856b81bd111a0dfa3543757c666d60e9829e50a2" exitCode=0 Nov 26 15:06:37 crc kubenswrapper[4651]: I1126 15:06:37.131024 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-72sf9" 
event={"ID":"ae123901-25f9-4788-b666-bcb72066c3c4","Type":"ContainerDied","Data":"02c416faca21be52dd878cec856b81bd111a0dfa3543757c666d60e9829e50a2"} Nov 26 15:06:38 crc kubenswrapper[4651]: I1126 15:06:38.399594 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-72sf9" Nov 26 15:06:38 crc kubenswrapper[4651]: I1126 15:06:38.584154 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-combined-ca-bundle\") pod \"ae123901-25f9-4788-b666-bcb72066c3c4\" (UID: \"ae123901-25f9-4788-b666-bcb72066c3c4\") " Nov 26 15:06:38 crc kubenswrapper[4651]: I1126 15:06:38.584455 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlw7r\" (UniqueName: \"kubernetes.io/projected/ae123901-25f9-4788-b666-bcb72066c3c4-kube-api-access-tlw7r\") pod \"ae123901-25f9-4788-b666-bcb72066c3c4\" (UID: \"ae123901-25f9-4788-b666-bcb72066c3c4\") " Nov 26 15:06:38 crc kubenswrapper[4651]: I1126 15:06:38.584568 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-config-data\") pod \"ae123901-25f9-4788-b666-bcb72066c3c4\" (UID: \"ae123901-25f9-4788-b666-bcb72066c3c4\") " Nov 26 15:06:38 crc kubenswrapper[4651]: I1126 15:06:38.592365 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae123901-25f9-4788-b666-bcb72066c3c4-kube-api-access-tlw7r" (OuterVolumeSpecName: "kube-api-access-tlw7r") pod "ae123901-25f9-4788-b666-bcb72066c3c4" (UID: "ae123901-25f9-4788-b666-bcb72066c3c4"). InnerVolumeSpecName "kube-api-access-tlw7r". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 15:06:38 crc kubenswrapper[4651]: I1126 15:06:38.610635 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae123901-25f9-4788-b666-bcb72066c3c4" (UID: "ae123901-25f9-4788-b666-bcb72066c3c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:06:38 crc kubenswrapper[4651]: I1126 15:06:38.665244 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-config-data" (OuterVolumeSpecName: "config-data") pod "ae123901-25f9-4788-b666-bcb72066c3c4" (UID: "ae123901-25f9-4788-b666-bcb72066c3c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:06:38 crc kubenswrapper[4651]: I1126 15:06:38.686134 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 15:06:38 crc kubenswrapper[4651]: I1126 15:06:38.686175 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae123901-25f9-4788-b666-bcb72066c3c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 15:06:38 crc kubenswrapper[4651]: I1126 15:06:38.686192 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlw7r\" (UniqueName: \"kubernetes.io/projected/ae123901-25f9-4788-b666-bcb72066c3c4-kube-api-access-tlw7r\") on node \"crc\" DevicePath \"\""
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.150602 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-72sf9" event={"ID":"ae123901-25f9-4788-b666-bcb72066c3c4","Type":"ContainerDied","Data":"48cd3a026956a4a4bfba9ac7f58e3c3b34fc61b48c45760904a96e0dc06fc1dd"}
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.151184 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48cd3a026956a4a4bfba9ac7f58e3c3b34fc61b48c45760904a96e0dc06fc1dd"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.150658 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-72sf9"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.393910 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-784f69c749-tzgks"]
Nov 26 15:06:39 crc kubenswrapper[4651]: E1126 15:06:39.394329 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7b363a-a7d4-4197-b711-2d3a0b761273" containerName="mariadb-account-create-update"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394342 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7b363a-a7d4-4197-b711-2d3a0b761273" containerName="mariadb-account-create-update"
Nov 26 15:06:39 crc kubenswrapper[4651]: E1126 15:06:39.394361 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d9303a-3724-4f7c-90b2-ef9ba8b92200" containerName="dnsmasq-dns"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394368 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d9303a-3724-4f7c-90b2-ef9ba8b92200" containerName="dnsmasq-dns"
Nov 26 15:06:39 crc kubenswrapper[4651]: E1126 15:06:39.394386 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d9303a-3724-4f7c-90b2-ef9ba8b92200" containerName="init"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394394 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d9303a-3724-4f7c-90b2-ef9ba8b92200" containerName="init"
Nov 26 15:06:39 crc kubenswrapper[4651]: E1126 15:06:39.394406 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78599f6-a349-4abd-b862-37ea4d85818d" containerName="mariadb-database-create"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394414 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78599f6-a349-4abd-b862-37ea4d85818d" containerName="mariadb-database-create"
Nov 26 15:06:39 crc kubenswrapper[4651]: E1126 15:06:39.394437 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46f23b6-3605-4160-a29e-b7f2a84b48f5" containerName="mariadb-database-create"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394444 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46f23b6-3605-4160-a29e-b7f2a84b48f5" containerName="mariadb-database-create"
Nov 26 15:06:39 crc kubenswrapper[4651]: E1126 15:06:39.394454 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae123901-25f9-4788-b666-bcb72066c3c4" containerName="keystone-db-sync"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394461 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae123901-25f9-4788-b666-bcb72066c3c4" containerName="keystone-db-sync"
Nov 26 15:06:39 crc kubenswrapper[4651]: E1126 15:06:39.394476 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ffdfc1-d77f-4094-a0ba-2800d4c4d878" containerName="mariadb-database-create"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394484 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ffdfc1-d77f-4094-a0ba-2800d4c4d878" containerName="mariadb-database-create"
Nov 26 15:06:39 crc kubenswrapper[4651]: E1126 15:06:39.394496 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d514364-b561-4d18-9b82-bfd428216060" containerName="mariadb-account-create-update"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394504 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d514364-b561-4d18-9b82-bfd428216060" containerName="mariadb-account-create-update"
Nov 26 15:06:39 crc kubenswrapper[4651]: E1126 15:06:39.394517 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf0489a-9b4f-4cd4-95a8-42a5fd115b89" containerName="mariadb-account-create-update"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394523 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf0489a-9b4f-4cd4-95a8-42a5fd115b89" containerName="mariadb-account-create-update"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394739 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d9303a-3724-4f7c-90b2-ef9ba8b92200" containerName="dnsmasq-dns"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394751 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ffdfc1-d77f-4094-a0ba-2800d4c4d878" containerName="mariadb-database-create"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394764 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae123901-25f9-4788-b666-bcb72066c3c4" containerName="keystone-db-sync"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394782 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7b363a-a7d4-4197-b711-2d3a0b761273" containerName="mariadb-account-create-update"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394797 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf0489a-9b4f-4cd4-95a8-42a5fd115b89" containerName="mariadb-account-create-update"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394808 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46f23b6-3605-4160-a29e-b7f2a84b48f5" containerName="mariadb-database-create"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394818 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d514364-b561-4d18-9b82-bfd428216060" containerName="mariadb-account-create-update"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.394827 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="a78599f6-a349-4abd-b862-37ea4d85818d" containerName="mariadb-database-create"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.395803 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.439090 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-tzgks"]
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.497447 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54fk4\" (UniqueName: \"kubernetes.io/projected/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-kube-api-access-54fk4\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.497823 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.497908 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-config\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.497948 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-dns-svc\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.497980 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.499642 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gqvl5"]
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.510108 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.513018 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k89xc"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.513267 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.513386 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.513608 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.513711 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.534565 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gqvl5"]
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.601731 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-config-data\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.601792 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54fk4\" (UniqueName: \"kubernetes.io/projected/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-kube-api-access-54fk4\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.601816 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-credential-keys\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.601896 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-scripts\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.601927 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.602025 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-config\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.602079 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-combined-ca-bundle\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.602109 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-fernet-keys\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.602144 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-dns-svc\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.602174 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.602208 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmw5w\" (UniqueName: \"kubernetes.io/projected/5cb5a12e-4adf-4864-b97a-b73ec221f326-kube-api-access-hmw5w\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.604130 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-config\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.604682 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.606877 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-dns-svc\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.610327 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.626387 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54fk4\" (UniqueName: \"kubernetes.io/projected/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-kube-api-access-54fk4\") pod \"dnsmasq-dns-784f69c749-tzgks\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.704297 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-combined-ca-bundle\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.704343 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-fernet-keys\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.704382 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmw5w\" (UniqueName: \"kubernetes.io/projected/5cb5a12e-4adf-4864-b97a-b73ec221f326-kube-api-access-hmw5w\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.704427 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-config-data\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.704445 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-credential-keys\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.704474 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-scripts\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.709218 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-scripts\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.726511 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-credential-keys\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.728513 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-fernet-keys\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.731707 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-combined-ca-bundle\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.736674 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-tzgks"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.745946 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-config-data\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.792766 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cf66b5549-hd4q7"]
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.793989 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.800962 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmw5w\" (UniqueName: \"kubernetes.io/projected/5cb5a12e-4adf-4864-b97a-b73ec221f326-kube-api-access-hmw5w\") pod \"keystone-bootstrap-gqvl5\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.807709 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.807882 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.807984 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.808161 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-pnk7g"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.844653 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gqvl5"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.849623 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cf66b5549-hd4q7"]
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.911440 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-scripts\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.911492 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-config-data\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.911535 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-logs\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.911560 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rffk2\" (UniqueName: \"kubernetes.io/projected/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-kube-api-access-rffk2\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.911581 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-horizon-secret-key\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.933324 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-wzxcr"]
Nov 26 15:06:39 crc kubenswrapper[4651]: I1126 15:06:39.972453 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:39.999595 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-q7r49"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:39.999855 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.000275 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.015331 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wzxcr"]
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.051881 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-scripts\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.051938 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rffk2\" (UniqueName: \"kubernetes.io/projected/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-kube-api-access-rffk2\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.051972 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-horizon-secret-key\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.051992 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sff7j\" (UniqueName: \"kubernetes.io/projected/0b39efce-2985-4f46-91a2-bb397f605c9c-kube-api-access-sff7j\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.052051 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-combined-ca-bundle\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.052092 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b39efce-2985-4f46-91a2-bb397f605c9c-etc-machine-id\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.052116 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-config-data\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.052168 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-scripts\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.052216 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-db-sync-config-data\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.052233 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-config-data\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.052290 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-logs\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.053545 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-logs\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.063251 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-config-data\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.078358 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-scripts\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.084742 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.101500 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.103955 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-horizon-secret-key\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.107656 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.107846 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.143636 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rffk2\" (UniqueName: \"kubernetes.io/projected/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-kube-api-access-rffk2\") pod \"horizon-5cf66b5549-hd4q7\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153598 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sff7j\" (UniqueName: \"kubernetes.io/projected/0b39efce-2985-4f46-91a2-bb397f605c9c-kube-api-access-sff7j\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153656 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-combined-ca-bundle\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153676 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153707 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b39efce-2985-4f46-91a2-bb397f605c9c-etc-machine-id\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153728 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-config-data\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153750 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153768 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jrbh\" (UniqueName: \"kubernetes.io/projected/34a40fec-099f-437f-b32a-2b81bf3b32f8-kube-api-access-9jrbh\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153790 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-scripts\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153811 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-config-data\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153830 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-run-httpd\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153851 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-db-sync-config-data\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153897 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-scripts\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.153914 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-log-httpd\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.157148 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b39efce-2985-4f46-91a2-bb397f605c9c-etc-machine-id\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.161820 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-combined-ca-bundle\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.171277 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-db-sync-config-data\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.176274 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-tzgks"]
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.179469 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cf66b5549-hd4q7"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.176474 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-scripts\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.180160 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-config-data\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.213108 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.220911 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sff7j\" (UniqueName: \"kubernetes.io/projected/0b39efce-2985-4f46-91a2-bb397f605c9c-kube-api-access-sff7j\") pod \"cinder-db-sync-wzxcr\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") " pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.255523 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9glnv"]
Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.262082 4651 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-db-sync-9glnv" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.277828 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-log-httpd\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.277949 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.278072 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.278113 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jrbh\" (UniqueName: \"kubernetes.io/projected/34a40fec-099f-437f-b32a-2b81bf3b32f8-kube-api-access-9jrbh\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.278157 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-scripts\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.278192 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-config-data\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.278219 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-run-httpd\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.278728 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-run-httpd\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.280166 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-log-httpd\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.282422 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-s27gg" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.282637 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.283240 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.296652 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9glnv"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.297010 4651 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.299553 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-scripts\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.301286 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.302086 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-config-data\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.318849 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-n9whp"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.319926 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.329578 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wzxcr" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.333972 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.334101 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-t2vhh" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.334150 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.338145 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-n9whp"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.350401 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tx4g"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.352224 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.360746 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jrbh\" (UniqueName: \"kubernetes.io/projected/34a40fec-099f-437f-b32a-2b81bf3b32f8-kube-api-access-9jrbh\") pod \"ceilometer-0\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.386268 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r896s\" (UniqueName: \"kubernetes.io/projected/147296af-97b7-4982-ab39-d7f3b78f042d-kube-api-access-r896s\") pod \"neutron-db-sync-9glnv\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " pod="openstack/neutron-db-sync-9glnv" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.402244 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.402315 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgth5\" (UniqueName: \"kubernetes.io/projected/c1259668-c013-4143-b8b4-677a639a764e-kube-api-access-mgth5\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.402429 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-scripts\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " 
pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.402490 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrms2\" (UniqueName: \"kubernetes.io/projected/4a69dd81-a01b-4195-a42d-a07126f24904-kube-api-access-qrms2\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.402526 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-combined-ca-bundle\") pod \"neutron-db-sync-9glnv\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " pod="openstack/neutron-db-sync-9glnv" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.402631 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-combined-ca-bundle\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.391004 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tx4g"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.404154 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-config-data\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.404278 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.404348 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-config\") pod \"neutron-db-sync-9glnv\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " pod="openstack/neutron-db-sync-9glnv" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.404373 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-dns-svc\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.404393 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1259668-c013-4143-b8b4-677a639a764e-logs\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.404467 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-config\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.429220 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.430550 4651 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.438095 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.438281 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.438381 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-62rlj" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.438860 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.443146 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-p6s6f"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.444258 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.453271 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pv895" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.453528 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.485267 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.488422 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.491598 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-p6s6f"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506282 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r896s\" (UniqueName: \"kubernetes.io/projected/147296af-97b7-4982-ab39-d7f3b78f042d-kube-api-access-r896s\") pod \"neutron-db-sync-9glnv\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " pod="openstack/neutron-db-sync-9glnv" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506334 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506354 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgth5\" (UniqueName: \"kubernetes.io/projected/c1259668-c013-4143-b8b4-677a639a764e-kube-api-access-mgth5\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506412 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-scripts\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506437 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrms2\" (UniqueName: 
\"kubernetes.io/projected/4a69dd81-a01b-4195-a42d-a07126f24904-kube-api-access-qrms2\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506472 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-combined-ca-bundle\") pod \"neutron-db-sync-9glnv\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " pod="openstack/neutron-db-sync-9glnv" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506501 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-combined-ca-bundle\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506553 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-config-data\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506578 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506620 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-config\") pod 
\"neutron-db-sync-9glnv\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " pod="openstack/neutron-db-sync-9glnv" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506639 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-dns-svc\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506655 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1259668-c013-4143-b8b4-677a639a764e-logs\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.506717 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-config\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.513240 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.514297 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-config\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: 
I1126 15:06:40.526211 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1259668-c013-4143-b8b4-677a639a764e-logs\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.526600 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.526747 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-combined-ca-bundle\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.527315 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-dns-svc\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.531496 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-config\") pod \"neutron-db-sync-9glnv\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " pod="openstack/neutron-db-sync-9glnv" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.550059 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-combined-ca-bundle\") pod \"neutron-db-sync-9glnv\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " pod="openstack/neutron-db-sync-9glnv" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.560968 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r896s\" (UniqueName: \"kubernetes.io/projected/147296af-97b7-4982-ab39-d7f3b78f042d-kube-api-access-r896s\") pod \"neutron-db-sync-9glnv\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " pod="openstack/neutron-db-sync-9glnv" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.561489 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-config-data\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.561754 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrms2\" (UniqueName: \"kubernetes.io/projected/4a69dd81-a01b-4195-a42d-a07126f24904-kube-api-access-qrms2\") pod \"dnsmasq-dns-f84976bdf-9tx4g\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.564344 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgth5\" (UniqueName: \"kubernetes.io/projected/c1259668-c013-4143-b8b4-677a639a764e-kube-api-access-mgth5\") pod \"placement-db-sync-n9whp\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.576905 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-scripts\") pod \"placement-db-sync-n9whp\" (UID: 
\"c1259668-c013-4143-b8b4-677a639a764e\") " pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.610287 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-779f7b5c77-w99hg"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615007 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615097 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-db-sync-config-data\") pod \"barbican-db-sync-p6s6f\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615136 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-scripts\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615155 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615172 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkhsp\" (UniqueName: \"kubernetes.io/projected/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-kube-api-access-tkhsp\") pod \"barbican-db-sync-p6s6f\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615198 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klmdx\" (UniqueName: \"kubernetes.io/projected/7d558494-64d8-42ba-b992-449f4c406597-kube-api-access-klmdx\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615218 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615253 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615283 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-logs\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615304 4651 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-combined-ca-bundle\") pod \"barbican-db-sync-p6s6f\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615321 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.615570 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.632535 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9glnv" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.653652 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-779f7b5c77-w99hg"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.667429 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.678315 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.688392 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 26 15:06:40 crc kubenswrapper[4651]: I1126 15:06:40.688747 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.697847 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n9whp" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716267 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b66e29d8-870f-417d-898a-fd47e5f16215-horizon-secret-key\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716309 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klmdx\" (UniqueName: \"kubernetes.io/projected/7d558494-64d8-42ba-b992-449f4c406597-kube-api-access-klmdx\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716332 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716359 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716390 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-logs\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716406 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66e29d8-870f-417d-898a-fd47e5f16215-logs\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716427 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-combined-ca-bundle\") pod \"barbican-db-sync-p6s6f\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716444 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716490 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-scripts\") pod 
\"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716522 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716556 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-db-sync-config-data\") pod \"barbican-db-sync-p6s6f\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716573 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lggq5\" (UniqueName: \"kubernetes.io/projected/b66e29d8-870f-417d-898a-fd47e5f16215-kube-api-access-lggq5\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716603 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-config-data\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716623 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716641 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.716693 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkhsp\" (UniqueName: \"kubernetes.io/projected/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-kube-api-access-tkhsp\") pod \"barbican-db-sync-p6s6f\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.717384 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.734570 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.739056 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-logs\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.739284 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.752532 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-combined-ca-bundle\") pod \"barbican-db-sync-p6s6f\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.758144 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.759227 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-db-sync-config-data\") pod \"barbican-db-sync-p6s6f\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.759549 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.764927 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.776644 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkhsp\" (UniqueName: \"kubernetes.io/projected/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-kube-api-access-tkhsp\") pod \"barbican-db-sync-p6s6f\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.782700 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.783122 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-scripts\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.814823 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc 
kubenswrapper[4651]: I1126 15:06:40.819802 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klmdx\" (UniqueName: \"kubernetes.io/projected/7d558494-64d8-42ba-b992-449f4c406597-kube-api-access-klmdx\") pod \"glance-default-external-api-0\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.826863 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-scripts\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.818598 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-scripts\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.842792 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.842855 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lggq5\" (UniqueName: \"kubernetes.io/projected/b66e29d8-870f-417d-898a-fd47e5f16215-kube-api-access-lggq5\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.842906 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.842965 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-config-data\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.842988 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq7dz\" (UniqueName: \"kubernetes.io/projected/667e006c-09d8-4551-9e7d-466546b549d8-kube-api-access-rq7dz\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.843093 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.843124 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b66e29d8-870f-417d-898a-fd47e5f16215-horizon-secret-key\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.843169 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.843222 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.843286 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-logs\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.843319 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66e29d8-870f-417d-898a-fd47e5f16215-logs\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.843424 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.844330 4651 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66e29d8-870f-417d-898a-fd47e5f16215-logs\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.850505 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.858074 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-config-data\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.887144 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b66e29d8-870f-417d-898a-fd47e5f16215-horizon-secret-key\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.911788 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lggq5\" (UniqueName: \"kubernetes.io/projected/b66e29d8-870f-417d-898a-fd47e5f16215-kube-api-access-lggq5\") pod \"horizon-779f7b5c77-w99hg\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.944927 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.944990 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.945029 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq7dz\" (UniqueName: \"kubernetes.io/projected/667e006c-09d8-4551-9e7d-466546b549d8-kube-api-access-rq7dz\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.945365 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.945087 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.960633 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.962554 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.962643 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-logs\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.962759 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.963568 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.968759 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.968985 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-logs\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:40.986417 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:41.036953 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:41.047472 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:41.099122 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq7dz\" (UniqueName: \"kubernetes.io/projected/667e006c-09d8-4551-9e7d-466546b549d8-kube-api-access-rq7dz\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:41.105637 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:41.158864 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-tzgks"] Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:41.190706 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:41.192223 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:41.242658 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:41.268120 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-tzgks" event={"ID":"829d59dd-8bb8-42ae-bbb3-5d44ece8b644","Type":"ContainerStarted","Data":"eb52c5510d9d57cb0cb6304db60a7f1bfd85f6c8e1bf3d79aff08aa4959022fa"} Nov 26 15:06:41 crc kubenswrapper[4651]: I1126 15:06:41.985138 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gqvl5"] Nov 26 15:06:42 crc kubenswrapper[4651]: W1126 15:06:41.991171 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cb5a12e_4adf_4864_b97a_b73ec221f326.slice/crio-5c24f36aa34d4fc6aa835e0aea7e198d1d93d4c838de4484e41df83b9de853fc WatchSource:0}: Error finding container 5c24f36aa34d4fc6aa835e0aea7e198d1d93d4c838de4484e41df83b9de853fc: Status 404 returned error can't find the container with id 5c24f36aa34d4fc6aa835e0aea7e198d1d93d4c838de4484e41df83b9de853fc Nov 26 15:06:42 crc 
kubenswrapper[4651]: I1126 15:06:42.121217 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wzxcr"] Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.125608 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cf66b5549-hd4q7"] Nov 26 15:06:42 crc kubenswrapper[4651]: W1126 15:06:42.136438 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b19bf8a_8202_4e5f_b3ac_6baa8316f3b5.slice/crio-f41156ae268a3b73a6cc50d77d4c59c8199b12c0b61227af633798d84a30b42e WatchSource:0}: Error finding container f41156ae268a3b73a6cc50d77d4c59c8199b12c0b61227af633798d84a30b42e: Status 404 returned error can't find the container with id f41156ae268a3b73a6cc50d77d4c59c8199b12c0b61227af633798d84a30b42e Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.290728 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gqvl5" event={"ID":"5cb5a12e-4adf-4864-b97a-b73ec221f326","Type":"ContainerStarted","Data":"e2432d3700fe0e8c82c27265682f774fc492ca087f3b84a2b8d108a5881130f4"} Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.290779 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gqvl5" event={"ID":"5cb5a12e-4adf-4864-b97a-b73ec221f326","Type":"ContainerStarted","Data":"5c24f36aa34d4fc6aa835e0aea7e198d1d93d4c838de4484e41df83b9de853fc"} Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.292935 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cf66b5549-hd4q7" event={"ID":"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5","Type":"ContainerStarted","Data":"f41156ae268a3b73a6cc50d77d4c59c8199b12c0b61227af633798d84a30b42e"} Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.294119 4651 generic.go:334] "Generic (PLEG): container finished" podID="829d59dd-8bb8-42ae-bbb3-5d44ece8b644" 
containerID="792fc31ed9809517198799e782f837b09e6addcab1aa01d9c11c09c21535abe6" exitCode=0 Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.294238 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-tzgks" event={"ID":"829d59dd-8bb8-42ae-bbb3-5d44ece8b644","Type":"ContainerDied","Data":"792fc31ed9809517198799e782f837b09e6addcab1aa01d9c11c09c21535abe6"} Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.296999 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wzxcr" event={"ID":"0b39efce-2985-4f46-91a2-bb397f605c9c","Type":"ContainerStarted","Data":"279b8117b34cd68b30ec015e5a18d3e2e9bfe2da47c31c7b5439909a17b14e13"} Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.327006 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gqvl5" podStartSLOduration=3.326989869 podStartE2EDuration="3.326989869s" podCreationTimestamp="2025-11-26 15:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:06:42.321169791 +0000 UTC m=+969.746917405" watchObservedRunningTime="2025-11-26 15:06:42.326989869 +0000 UTC m=+969.752737473" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.580933 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-tzgks" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.634728 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-sb\") pod \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.634820 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54fk4\" (UniqueName: \"kubernetes.io/projected/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-kube-api-access-54fk4\") pod \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.634866 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-dns-svc\") pod \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.634903 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-nb\") pod \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.635029 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-config\") pod \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\" (UID: \"829d59dd-8bb8-42ae-bbb3-5d44ece8b644\") " Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.668221 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-kube-api-access-54fk4" (OuterVolumeSpecName: "kube-api-access-54fk4") pod "829d59dd-8bb8-42ae-bbb3-5d44ece8b644" (UID: "829d59dd-8bb8-42ae-bbb3-5d44ece8b644"). InnerVolumeSpecName "kube-api-access-54fk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.748638 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "829d59dd-8bb8-42ae-bbb3-5d44ece8b644" (UID: "829d59dd-8bb8-42ae-bbb3-5d44ece8b644"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.752837 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "829d59dd-8bb8-42ae-bbb3-5d44ece8b644" (UID: "829d59dd-8bb8-42ae-bbb3-5d44ece8b644"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.763112 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54fk4\" (UniqueName: \"kubernetes.io/projected/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-kube-api-access-54fk4\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.763200 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.763210 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.783341 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-config" (OuterVolumeSpecName: "config") pod "829d59dd-8bb8-42ae-bbb3-5d44ece8b644" (UID: "829d59dd-8bb8-42ae-bbb3-5d44ece8b644"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.783579 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "829d59dd-8bb8-42ae-bbb3-5d44ece8b644" (UID: "829d59dd-8bb8-42ae-bbb3-5d44ece8b644"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.797024 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.834992 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-779f7b5c77-w99hg"] Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.849670 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cf66b5549-hd4q7"] Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.871730 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.872701 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/829d59dd-8bb8-42ae-bbb3-5d44ece8b644-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.885864 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9glnv"] Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.904817 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64574d9cd7-txx86"] Nov 26 15:06:42 crc kubenswrapper[4651]: E1126 15:06:42.905287 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829d59dd-8bb8-42ae-bbb3-5d44ece8b644" containerName="init" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.905308 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="829d59dd-8bb8-42ae-bbb3-5d44ece8b644" containerName="init" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.905492 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="829d59dd-8bb8-42ae-bbb3-5d44ece8b644" containerName="init" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 
15:06:42.906350 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.913805 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.928731 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64574d9cd7-txx86"] Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.958132 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-n9whp"] Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.973960 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-scripts\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.974159 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5446188b-09fd-46a6-acaf-7723dae3c68c-horizon-secret-key\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.974265 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5446188b-09fd-46a6-acaf-7723dae3c68c-logs\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.974401 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-config-data\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.974474 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8q96\" (UniqueName: \"kubernetes.io/projected/5446188b-09fd-46a6-acaf-7723dae3c68c-kube-api-access-q8q96\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:42 crc kubenswrapper[4651]: I1126 15:06:42.982233 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tx4g"] Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.042192 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.071147 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-p6s6f"] Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.080329 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-scripts\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.080366 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5446188b-09fd-46a6-acaf-7723dae3c68c-horizon-secret-key\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.080402 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/5446188b-09fd-46a6-acaf-7723dae3c68c-logs\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.080461 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-config-data\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.080482 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8q96\" (UniqueName: \"kubernetes.io/projected/5446188b-09fd-46a6-acaf-7723dae3c68c-kube-api-access-q8q96\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.081390 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-scripts\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.081501 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5446188b-09fd-46a6-acaf-7723dae3c68c-logs\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.082959 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-config-data\") pod \"horizon-64574d9cd7-txx86\" (UID: 
\"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.103062 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5446188b-09fd-46a6-acaf-7723dae3c68c-horizon-secret-key\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.105856 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8q96\" (UniqueName: \"kubernetes.io/projected/5446188b-09fd-46a6-acaf-7723dae3c68c-kube-api-access-q8q96\") pod \"horizon-64574d9cd7-txx86\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.118709 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.135722 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.164391 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:06:43 crc kubenswrapper[4651]: W1126 15:06:43.174116 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d558494_64d8_42ba_b992_449f4c406597.slice/crio-228978d0392f5251009fd4e87dba51dda9d37d2f9c1e6161c8a371ee58592aca WatchSource:0}: Error finding container 228978d0392f5251009fd4e87dba51dda9d37d2f9c1e6161c8a371ee58592aca: Status 404 returned error can't find the container with id 228978d0392f5251009fd4e87dba51dda9d37d2f9c1e6161c8a371ee58592aca Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.257694 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.355979 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n9whp" event={"ID":"c1259668-c013-4143-b8b4-677a639a764e","Type":"ContainerStarted","Data":"6ca018e12c09b99da245896b883f845a320860d328b4720a5d39c0c48d7e5481"} Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.357624 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-p6s6f" event={"ID":"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b","Type":"ContainerStarted","Data":"65cd413cfccece2e5db5616f1e32b1f1d2d0e01d5fa1548f4c51f4d7e31d59c2"} Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.360887 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-tzgks" event={"ID":"829d59dd-8bb8-42ae-bbb3-5d44ece8b644","Type":"ContainerDied","Data":"eb52c5510d9d57cb0cb6304db60a7f1bfd85f6c8e1bf3d79aff08aa4959022fa"} Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.360936 4651 scope.go:117] "RemoveContainer" containerID="792fc31ed9809517198799e782f837b09e6addcab1aa01d9c11c09c21535abe6" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.361117 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-tzgks" Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.407079 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9glnv" event={"ID":"147296af-97b7-4982-ab39-d7f3b78f042d","Type":"ContainerStarted","Data":"9b80947e71d93ff12c1bb485302fc49a88aa84d2e5df5581e3e2036cd5698c74"} Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.543055 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-779f7b5c77-w99hg" event={"ID":"b66e29d8-870f-417d-898a-fd47e5f16215","Type":"ContainerStarted","Data":"2fea52185403975cfbaebcf9adc01f5af5a3f645e09538d36719622b69423669"} Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.543089 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d558494-64d8-42ba-b992-449f4c406597","Type":"ContainerStarted","Data":"228978d0392f5251009fd4e87dba51dda9d37d2f9c1e6161c8a371ee58592aca"} Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.543101 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-tzgks"] Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.543114 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-tzgks"] Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.543129 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"667e006c-09d8-4551-9e7d-466546b549d8","Type":"ContainerStarted","Data":"7cd7167f23f9b849518ab086ccc2fc8a9597a2fd8d6f052c9fba1d00740c643f"} Nov 26 15:06:43 crc kubenswrapper[4651]: I1126 15:06:43.543142 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" event={"ID":"4a69dd81-a01b-4195-a42d-a07126f24904","Type":"ContainerStarted","Data":"6efbe1bea55f1899843cbbf67091397592da732cc4f5d3277ff1c53c35bc6fe3"} Nov 26 15:06:43 crc 
kubenswrapper[4651]: I1126 15:06:43.543151 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34a40fec-099f-437f-b32a-2b81bf3b32f8","Type":"ContainerStarted","Data":"4a21970cb8ddba9e1488e569bb522183c680b2b7bc3af08cf6af25170dcda8a1"} Nov 26 15:06:44 crc kubenswrapper[4651]: I1126 15:06:44.011379 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64574d9cd7-txx86"] Nov 26 15:06:44 crc kubenswrapper[4651]: W1126 15:06:44.037586 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5446188b_09fd_46a6_acaf_7723dae3c68c.slice/crio-575463b0ab3c844106adebb58dee75a784f305d0f0ba23d2434bb5fe29770012 WatchSource:0}: Error finding container 575463b0ab3c844106adebb58dee75a784f305d0f0ba23d2434bb5fe29770012: Status 404 returned error can't find the container with id 575463b0ab3c844106adebb58dee75a784f305d0f0ba23d2434bb5fe29770012 Nov 26 15:06:44 crc kubenswrapper[4651]: I1126 15:06:44.238354 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:06:44 crc kubenswrapper[4651]: E1126 15:06:44.238576 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:06:44 crc kubenswrapper[4651]: E1126 15:06:44.238598 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:06:44 crc kubenswrapper[4651]: E1126 15:06:44.238670 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift podName:a3b8c2db-ce7f-48ce-9fd1-d55b5583773e nodeName:}" failed. 
No retries permitted until 2025-11-26 15:07:48.238653596 +0000 UTC m=+1035.664401200 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift") pod "swift-storage-0" (UID: "a3b8c2db-ce7f-48ce-9fd1-d55b5583773e") : configmap "swift-ring-files" not found Nov 26 15:06:44 crc kubenswrapper[4651]: I1126 15:06:44.522436 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9glnv" event={"ID":"147296af-97b7-4982-ab39-d7f3b78f042d","Type":"ContainerStarted","Data":"6cb5e96d1bc453c6092225e2597e5183ffb18300836da01876f88d3898c4b4da"} Nov 26 15:06:44 crc kubenswrapper[4651]: I1126 15:06:44.525891 4651 generic.go:334] "Generic (PLEG): container finished" podID="4a69dd81-a01b-4195-a42d-a07126f24904" containerID="e6d71218cad4c1f93f13ac0dfbfca099c5fb7392a35ef5b5135f5d892aa11697" exitCode=0 Nov 26 15:06:44 crc kubenswrapper[4651]: I1126 15:06:44.525950 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" event={"ID":"4a69dd81-a01b-4195-a42d-a07126f24904","Type":"ContainerDied","Data":"e6d71218cad4c1f93f13ac0dfbfca099c5fb7392a35ef5b5135f5d892aa11697"} Nov 26 15:06:44 crc kubenswrapper[4651]: I1126 15:06:44.532527 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64574d9cd7-txx86" event={"ID":"5446188b-09fd-46a6-acaf-7723dae3c68c","Type":"ContainerStarted","Data":"575463b0ab3c844106adebb58dee75a784f305d0f0ba23d2434bb5fe29770012"} Nov 26 15:06:44 crc kubenswrapper[4651]: I1126 15:06:44.553301 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9glnv" podStartSLOduration=4.553284141 podStartE2EDuration="4.553284141s" podCreationTimestamp="2025-11-26 15:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:06:44.543480684 
+0000 UTC m=+971.969228298" watchObservedRunningTime="2025-11-26 15:06:44.553284141 +0000 UTC m=+971.979031745" Nov 26 15:06:45 crc kubenswrapper[4651]: I1126 15:06:45.432944 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="829d59dd-8bb8-42ae-bbb3-5d44ece8b644" path="/var/lib/kubelet/pods/829d59dd-8bb8-42ae-bbb3-5d44ece8b644/volumes" Nov 26 15:06:45 crc kubenswrapper[4651]: I1126 15:06:45.558540 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d558494-64d8-42ba-b992-449f4c406597","Type":"ContainerStarted","Data":"21e3f43f581cc8789810e3bd83328eadf41c2896b4258a2363e1ab3222ba8a28"} Nov 26 15:06:45 crc kubenswrapper[4651]: I1126 15:06:45.564246 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"667e006c-09d8-4551-9e7d-466546b549d8","Type":"ContainerStarted","Data":"7df89a1d08339746fdb9f48811448aba828e31833d02951604c1fd95194ebe4f"} Nov 26 15:06:45 crc kubenswrapper[4651]: I1126 15:06:45.569021 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" event={"ID":"4a69dd81-a01b-4195-a42d-a07126f24904","Type":"ContainerStarted","Data":"0f1f23ffb5d816450678d7df7839804d9680d95e49bf36db95c94b2b879f8dd5"} Nov 26 15:06:45 crc kubenswrapper[4651]: I1126 15:06:45.596168 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" podStartSLOduration=5.596142587 podStartE2EDuration="5.596142587s" podCreationTimestamp="2025-11-26 15:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:06:45.587986084 +0000 UTC m=+973.013733698" watchObservedRunningTime="2025-11-26 15:06:45.596142587 +0000 UTC m=+973.021890191" Nov 26 15:06:45 crc kubenswrapper[4651]: I1126 15:06:45.760390 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:47 crc kubenswrapper[4651]: I1126 15:06:47.613389 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"667e006c-09d8-4551-9e7d-466546b549d8","Type":"ContainerStarted","Data":"e36d50fff9338b737033c6b4a09b0064ce9a8e1e21ba302863a6876786b9bb77"} Nov 26 15:06:47 crc kubenswrapper[4651]: I1126 15:06:47.614532 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="667e006c-09d8-4551-9e7d-466546b549d8" containerName="glance-log" containerID="cri-o://7df89a1d08339746fdb9f48811448aba828e31833d02951604c1fd95194ebe4f" gracePeriod=30 Nov 26 15:06:47 crc kubenswrapper[4651]: I1126 15:06:47.614600 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="667e006c-09d8-4551-9e7d-466546b549d8" containerName="glance-httpd" containerID="cri-o://e36d50fff9338b737033c6b4a09b0064ce9a8e1e21ba302863a6876786b9bb77" gracePeriod=30 Nov 26 15:06:47 crc kubenswrapper[4651]: I1126 15:06:47.621793 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d558494-64d8-42ba-b992-449f4c406597","Type":"ContainerStarted","Data":"3cd199346d35d91443c454fd6527743909f8bc7debce030fff6d9d552693e8f7"} Nov 26 15:06:47 crc kubenswrapper[4651]: I1126 15:06:47.622093 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7d558494-64d8-42ba-b992-449f4c406597" containerName="glance-httpd" containerID="cri-o://3cd199346d35d91443c454fd6527743909f8bc7debce030fff6d9d552693e8f7" gracePeriod=30 Nov 26 15:06:47 crc kubenswrapper[4651]: I1126 15:06:47.622255 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7d558494-64d8-42ba-b992-449f4c406597" 
containerName="glance-log" containerID="cri-o://21e3f43f581cc8789810e3bd83328eadf41c2896b4258a2363e1ab3222ba8a28" gracePeriod=30 Nov 26 15:06:47 crc kubenswrapper[4651]: I1126 15:06:47.643431 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.643412444 podStartE2EDuration="7.643412444s" podCreationTimestamp="2025-11-26 15:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:06:47.637007049 +0000 UTC m=+975.062754653" watchObservedRunningTime="2025-11-26 15:06:47.643412444 +0000 UTC m=+975.069160048" Nov 26 15:06:47 crc kubenswrapper[4651]: I1126 15:06:47.676886 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.676868922 podStartE2EDuration="7.676868922s" podCreationTimestamp="2025-11-26 15:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:06:47.676668716 +0000 UTC m=+975.102416320" watchObservedRunningTime="2025-11-26 15:06:47.676868922 +0000 UTC m=+975.102616526" Nov 26 15:06:47 crc kubenswrapper[4651]: E1126 15:06:47.825150 4651 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod667e006c_09d8_4551_9e7d_466546b549d8.slice/crio-7df89a1d08339746fdb9f48811448aba828e31833d02951604c1fd95194ebe4f.scope\": RecentStats: unable to find data in memory cache]" Nov 26 15:06:48 crc kubenswrapper[4651]: I1126 15:06:48.637413 4651 generic.go:334] "Generic (PLEG): container finished" podID="7d558494-64d8-42ba-b992-449f4c406597" containerID="3cd199346d35d91443c454fd6527743909f8bc7debce030fff6d9d552693e8f7" exitCode=0 Nov 26 15:06:48 crc kubenswrapper[4651]: I1126 
15:06:48.637879 4651 generic.go:334] "Generic (PLEG): container finished" podID="7d558494-64d8-42ba-b992-449f4c406597" containerID="21e3f43f581cc8789810e3bd83328eadf41c2896b4258a2363e1ab3222ba8a28" exitCode=143 Nov 26 15:06:48 crc kubenswrapper[4651]: I1126 15:06:48.637657 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d558494-64d8-42ba-b992-449f4c406597","Type":"ContainerDied","Data":"3cd199346d35d91443c454fd6527743909f8bc7debce030fff6d9d552693e8f7"} Nov 26 15:06:48 crc kubenswrapper[4651]: I1126 15:06:48.637955 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d558494-64d8-42ba-b992-449f4c406597","Type":"ContainerDied","Data":"21e3f43f581cc8789810e3bd83328eadf41c2896b4258a2363e1ab3222ba8a28"} Nov 26 15:06:48 crc kubenswrapper[4651]: I1126 15:06:48.642697 4651 generic.go:334] "Generic (PLEG): container finished" podID="667e006c-09d8-4551-9e7d-466546b549d8" containerID="e36d50fff9338b737033c6b4a09b0064ce9a8e1e21ba302863a6876786b9bb77" exitCode=0 Nov 26 15:06:48 crc kubenswrapper[4651]: I1126 15:06:48.642725 4651 generic.go:334] "Generic (PLEG): container finished" podID="667e006c-09d8-4551-9e7d-466546b549d8" containerID="7df89a1d08339746fdb9f48811448aba828e31833d02951604c1fd95194ebe4f" exitCode=143 Nov 26 15:06:48 crc kubenswrapper[4651]: I1126 15:06:48.642746 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"667e006c-09d8-4551-9e7d-466546b549d8","Type":"ContainerDied","Data":"e36d50fff9338b737033c6b4a09b0064ce9a8e1e21ba302863a6876786b9bb77"} Nov 26 15:06:48 crc kubenswrapper[4651]: I1126 15:06:48.642781 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"667e006c-09d8-4551-9e7d-466546b549d8","Type":"ContainerDied","Data":"7df89a1d08339746fdb9f48811448aba828e31833d02951604c1fd95194ebe4f"} Nov 26 15:06:49 crc 
kubenswrapper[4651]: I1126 15:06:49.187000 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-779f7b5c77-w99hg"] Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.230684 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6974b49b94-vzn8h"] Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.243080 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.249396 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6974b49b94-vzn8h"] Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.293373 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.363433 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64574d9cd7-txx86"] Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.395800 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-logs\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.395955 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-tls-certs\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.396021 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-scripts\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.396123 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4998\" (UniqueName: \"kubernetes.io/projected/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-kube-api-access-q4998\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.396165 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-secret-key\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.396187 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-config-data\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.396210 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-combined-ca-bundle\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.440761 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f54c7c77d-rx8gm"] Nov 26 15:06:49 crc 
kubenswrapper[4651]: I1126 15:06:49.442256 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f54c7c77d-rx8gm"] Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.442355 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.497657 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4998\" (UniqueName: \"kubernetes.io/projected/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-kube-api-access-q4998\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.497745 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-secret-key\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.498027 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-config-data\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.498847 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-combined-ca-bundle\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.498914 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-logs\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.499027 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-tls-certs\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.499105 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-scripts\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.501072 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-logs\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.505903 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-config-data\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.508412 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-tls-certs\") pod \"horizon-6974b49b94-vzn8h\" (UID: 
\"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.513301 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-secret-key\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.514950 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-combined-ca-bundle\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.516275 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-scripts\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.520548 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4998\" (UniqueName: \"kubernetes.io/projected/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-kube-api-access-q4998\") pod \"horizon-6974b49b94-vzn8h\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.601247 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c09de21-84b0-440d-b34c-3054ec6741fc-scripts\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc 
kubenswrapper[4651]: I1126 15:06:49.601316 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c09de21-84b0-440d-b34c-3054ec6741fc-config-data\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.601360 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c09de21-84b0-440d-b34c-3054ec6741fc-combined-ca-bundle\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.601376 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2ltr\" (UniqueName: \"kubernetes.io/projected/5c09de21-84b0-440d-b34c-3054ec6741fc-kube-api-access-z2ltr\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.601440 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c09de21-84b0-440d-b34c-3054ec6741fc-logs\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.601645 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5c09de21-84b0-440d-b34c-3054ec6741fc-horizon-secret-key\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc 
kubenswrapper[4651]: I1126 15:06:49.601707 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c09de21-84b0-440d-b34c-3054ec6741fc-horizon-tls-certs\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.611514 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.704886 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c09de21-84b0-440d-b34c-3054ec6741fc-scripts\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.704967 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c09de21-84b0-440d-b34c-3054ec6741fc-config-data\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.705016 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c09de21-84b0-440d-b34c-3054ec6741fc-combined-ca-bundle\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.705054 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2ltr\" (UniqueName: \"kubernetes.io/projected/5c09de21-84b0-440d-b34c-3054ec6741fc-kube-api-access-z2ltr\") pod \"horizon-f54c7c77d-rx8gm\" (UID: 
\"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.705080 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c09de21-84b0-440d-b34c-3054ec6741fc-logs\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.705171 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5c09de21-84b0-440d-b34c-3054ec6741fc-horizon-secret-key\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.705203 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c09de21-84b0-440d-b34c-3054ec6741fc-horizon-tls-certs\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.705815 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c09de21-84b0-440d-b34c-3054ec6741fc-logs\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.707158 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c09de21-84b0-440d-b34c-3054ec6741fc-scripts\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.707332 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c09de21-84b0-440d-b34c-3054ec6741fc-config-data\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.709864 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5c09de21-84b0-440d-b34c-3054ec6741fc-horizon-secret-key\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.711208 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c09de21-84b0-440d-b34c-3054ec6741fc-combined-ca-bundle\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.711597 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c09de21-84b0-440d-b34c-3054ec6741fc-horizon-tls-certs\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.726914 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2ltr\" (UniqueName: \"kubernetes.io/projected/5c09de21-84b0-440d-b34c-3054ec6741fc-kube-api-access-z2ltr\") pod \"horizon-f54c7c77d-rx8gm\" (UID: \"5c09de21-84b0-440d-b34c-3054ec6741fc\") " pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:49 crc kubenswrapper[4651]: I1126 15:06:49.779768 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:06:50 crc kubenswrapper[4651]: I1126 15:06:50.664856 4651 generic.go:334] "Generic (PLEG): container finished" podID="5cb5a12e-4adf-4864-b97a-b73ec221f326" containerID="e2432d3700fe0e8c82c27265682f774fc492ca087f3b84a2b8d108a5881130f4" exitCode=0 Nov 26 15:06:50 crc kubenswrapper[4651]: I1126 15:06:50.664903 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gqvl5" event={"ID":"5cb5a12e-4adf-4864-b97a-b73ec221f326","Type":"ContainerDied","Data":"e2432d3700fe0e8c82c27265682f774fc492ca087f3b84a2b8d108a5881130f4"} Nov 26 15:06:50 crc kubenswrapper[4651]: I1126 15:06:50.763026 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:06:50 crc kubenswrapper[4651]: I1126 15:06:50.824482 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-5vwzl"] Nov 26 15:06:50 crc kubenswrapper[4651]: I1126 15:06:50.824727 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" podUID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerName="dnsmasq-dns" containerID="cri-o://38d2aa0310885489fb063b1a8b2ed3ccf0d19b385a8afba9a40c39fe0111e6ef" gracePeriod=10 Nov 26 15:06:51 crc kubenswrapper[4651]: I1126 15:06:51.676422 4651 generic.go:334] "Generic (PLEG): container finished" podID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerID="38d2aa0310885489fb063b1a8b2ed3ccf0d19b385a8afba9a40c39fe0111e6ef" exitCode=0 Nov 26 15:06:51 crc kubenswrapper[4651]: I1126 15:06:51.676500 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" event={"ID":"96e9fa03-a4a5-4ecb-8f87-feb41f41083f","Type":"ContainerDied","Data":"38d2aa0310885489fb063b1a8b2ed3ccf0d19b385a8afba9a40c39fe0111e6ef"} Nov 26 15:07:00 crc kubenswrapper[4651]: I1126 15:07:00.609226 4651 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" podUID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: i/o timeout" Nov 26 15:07:01 crc kubenswrapper[4651]: E1126 15:07:01.694029 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Nov 26 15:07:01 crc kubenswrapper[4651]: E1126 15:07:01.694505 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57dh97h5cbhfdh7fh54fh57bh56dh64fhddhch5c6h99h4h68fh559hd8hfchbdhbch59chf9h575hb9h55hf8h5d6hchdbh658hbdh89q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lggq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityConte
xt:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-779f7b5c77-w99hg_openstack(b66e29d8-870f-417d-898a-fd47e5f16215): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:07:01 crc kubenswrapper[4651]: E1126 15:07:01.697702 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-779f7b5c77-w99hg" podUID="b66e29d8-870f-417d-898a-fd47e5f16215" Nov 26 15:07:05 crc kubenswrapper[4651]: E1126 15:07:05.027874 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Nov 26 15:07:05 crc kubenswrapper[4651]: E1126 15:07:05.028248 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n9dh556hch594h689hb9h66ch74h659h5h5cbh549h658h699h5h665h54dhcch678h584h599hdch5ch548h68bh5d7h675h66fh56bh658h675hcbq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9jrbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(34a40fec-099f-437f-b32a-2b81bf3b32f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:07:05 crc kubenswrapper[4651]: E1126 15:07:05.039162 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Nov 26 15:07:05 crc kubenswrapper[4651]: E1126 15:07:05.039553 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6ch56fhbbh688h5fchc9h545h68bh678h54fhb7h7dh64h5d6h554h85h584h65fh669h684hbbh54fh57ch8bh68dh68dh666h4h657hcbh5b6h5bbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rffk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5cf66b5549-hd4q7_openstack(6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:07:05 crc kubenswrapper[4651]: E1126 
15:07:05.041834 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5cf66b5549-hd4q7" podUID="6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5" Nov 26 15:07:05 crc kubenswrapper[4651]: E1126 15:07:05.050133 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Nov 26 15:07:05 crc kubenswrapper[4651]: E1126 15:07:05.050296 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n675h7h676hd5h699h596h55ch55fhfh697h68ch65chcch66fhffh585h9dhc9hdh684h5dch5b6h5fch58h54fh57fh5c6h555h55dh5d7hb4h675q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q8q96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-64574d9cd7-txx86_openstack(5446188b-09fd-46a6-acaf-7723dae3c68c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:07:05 crc kubenswrapper[4651]: E1126 
15:07:05.052361 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-64574d9cd7-txx86" podUID="5446188b-09fd-46a6-acaf-7723dae3c68c" Nov 26 15:07:05 crc kubenswrapper[4651]: I1126 15:07:05.609986 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" podUID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: i/o timeout" Nov 26 15:07:10 crc kubenswrapper[4651]: I1126 15:07:10.610711 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" podUID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: i/o timeout" Nov 26 15:07:10 crc kubenswrapper[4651]: I1126 15:07:10.611832 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:07:11 crc kubenswrapper[4651]: I1126 15:07:11.107129 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 26 15:07:11 crc kubenswrapper[4651]: I1126 15:07:11.107182 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 26 15:07:11 crc kubenswrapper[4651]: I1126 15:07:11.244009 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:11 crc kubenswrapper[4651]: I1126 15:07:11.245145 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 26 
15:07:14 crc kubenswrapper[4651]: I1126 15:07:14.964302 4651 generic.go:334] "Generic (PLEG): container finished" podID="147296af-97b7-4982-ab39-d7f3b78f042d" containerID="6cb5e96d1bc453c6092225e2597e5183ffb18300836da01876f88d3898c4b4da" exitCode=0 Nov 26 15:07:14 crc kubenswrapper[4651]: I1126 15:07:14.964376 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9glnv" event={"ID":"147296af-97b7-4982-ab39-d7f3b78f042d","Type":"ContainerDied","Data":"6cb5e96d1bc453c6092225e2597e5183ffb18300836da01876f88d3898c4b4da"} Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.504445 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.513790 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gqvl5" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.514998 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-combined-ca-bundle\") pod \"5cb5a12e-4adf-4864-b97a-b73ec221f326\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.515110 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-fernet-keys\") pod \"5cb5a12e-4adf-4864-b97a-b73ec221f326\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.515155 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-sb\") pod \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " Nov 26 
15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.515208 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-config\") pod \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.515238 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-config-data\") pod \"5cb5a12e-4adf-4864-b97a-b73ec221f326\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.516743 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-dns-svc\") pod \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.516785 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-scripts\") pod \"5cb5a12e-4adf-4864-b97a-b73ec221f326\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.516824 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2qv6\" (UniqueName: \"kubernetes.io/projected/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-kube-api-access-p2qv6\") pod \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.516907 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmw5w\" (UniqueName: \"kubernetes.io/projected/5cb5a12e-4adf-4864-b97a-b73ec221f326-kube-api-access-hmw5w\") pod 
\"5cb5a12e-4adf-4864-b97a-b73ec221f326\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.516959 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-nb\") pod \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\" (UID: \"96e9fa03-a4a5-4ecb-8f87-feb41f41083f\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.516985 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-credential-keys\") pod \"5cb5a12e-4adf-4864-b97a-b73ec221f326\" (UID: \"5cb5a12e-4adf-4864-b97a-b73ec221f326\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.544558 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-kube-api-access-p2qv6" (OuterVolumeSpecName: "kube-api-access-p2qv6") pod "96e9fa03-a4a5-4ecb-8f87-feb41f41083f" (UID: "96e9fa03-a4a5-4ecb-8f87-feb41f41083f"). InnerVolumeSpecName "kube-api-access-p2qv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.545253 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-scripts" (OuterVolumeSpecName: "scripts") pod "5cb5a12e-4adf-4864-b97a-b73ec221f326" (UID: "5cb5a12e-4adf-4864-b97a-b73ec221f326"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.547168 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cb5a12e-4adf-4864-b97a-b73ec221f326-kube-api-access-hmw5w" (OuterVolumeSpecName: "kube-api-access-hmw5w") pod "5cb5a12e-4adf-4864-b97a-b73ec221f326" (UID: "5cb5a12e-4adf-4864-b97a-b73ec221f326"). InnerVolumeSpecName "kube-api-access-hmw5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.547654 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5cb5a12e-4adf-4864-b97a-b73ec221f326" (UID: "5cb5a12e-4adf-4864-b97a-b73ec221f326"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.546115 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.564887 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5cb5a12e-4adf-4864-b97a-b73ec221f326" (UID: "5cb5a12e-4adf-4864-b97a-b73ec221f326"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.621525 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-scripts\") pod \"7d558494-64d8-42ba-b992-449f4c406597\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.621567 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klmdx\" (UniqueName: \"kubernetes.io/projected/7d558494-64d8-42ba-b992-449f4c406597-kube-api-access-klmdx\") pod \"7d558494-64d8-42ba-b992-449f4c406597\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.621601 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-config-data\") pod \"7d558494-64d8-42ba-b992-449f4c406597\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.621635 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-combined-ca-bundle\") pod \"7d558494-64d8-42ba-b992-449f4c406597\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.621659 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-logs\") pod \"7d558494-64d8-42ba-b992-449f4c406597\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.621701 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-public-tls-certs\") pod \"7d558494-64d8-42ba-b992-449f4c406597\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.621798 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"7d558494-64d8-42ba-b992-449f4c406597\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.621823 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-httpd-run\") pod \"7d558494-64d8-42ba-b992-449f4c406597\" (UID: \"7d558494-64d8-42ba-b992-449f4c406597\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.622303 4651 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.622324 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.622334 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2qv6\" (UniqueName: \"kubernetes.io/projected/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-kube-api-access-p2qv6\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.622345 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmw5w\" (UniqueName: \"kubernetes.io/projected/5cb5a12e-4adf-4864-b97a-b73ec221f326-kube-api-access-hmw5w\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.622355 4651 
reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.622820 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7d558494-64d8-42ba-b992-449f4c406597" (UID: "7d558494-64d8-42ba-b992-449f4c406597"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.628446 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" podUID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: i/o timeout" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.629822 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-scripts" (OuterVolumeSpecName: "scripts") pod "7d558494-64d8-42ba-b992-449f4c406597" (UID: "7d558494-64d8-42ba-b992-449f4c406597"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.642993 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-logs" (OuterVolumeSpecName: "logs") pod "7d558494-64d8-42ba-b992-449f4c406597" (UID: "7d558494-64d8-42ba-b992-449f4c406597"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.653770 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "7d558494-64d8-42ba-b992-449f4c406597" (UID: "7d558494-64d8-42ba-b992-449f4c406597"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.653957 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d558494-64d8-42ba-b992-449f4c406597-kube-api-access-klmdx" (OuterVolumeSpecName: "kube-api-access-klmdx") pod "7d558494-64d8-42ba-b992-449f4c406597" (UID: "7d558494-64d8-42ba-b992-449f4c406597"). InnerVolumeSpecName "kube-api-access-klmdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.655868 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-config" (OuterVolumeSpecName: "config") pod "96e9fa03-a4a5-4ecb-8f87-feb41f41083f" (UID: "96e9fa03-a4a5-4ecb-8f87-feb41f41083f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.659402 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-config-data" (OuterVolumeSpecName: "config-data") pod "5cb5a12e-4adf-4864-b97a-b73ec221f326" (UID: "5cb5a12e-4adf-4864-b97a-b73ec221f326"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.665778 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "96e9fa03-a4a5-4ecb-8f87-feb41f41083f" (UID: "96e9fa03-a4a5-4ecb-8f87-feb41f41083f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.669842 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cb5a12e-4adf-4864-b97a-b73ec221f326" (UID: "5cb5a12e-4adf-4864-b97a-b73ec221f326"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.685866 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d558494-64d8-42ba-b992-449f4c406597" (UID: "7d558494-64d8-42ba-b992-449f4c406597"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.686663 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "96e9fa03-a4a5-4ecb-8f87-feb41f41083f" (UID: "96e9fa03-a4a5-4ecb-8f87-feb41f41083f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.688671 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "96e9fa03-a4a5-4ecb-8f87-feb41f41083f" (UID: "96e9fa03-a4a5-4ecb-8f87-feb41f41083f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.705635 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-config-data" (OuterVolumeSpecName: "config-data") pod "7d558494-64d8-42ba-b992-449f4c406597" (UID: "7d558494-64d8-42ba-b992-449f4c406597"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.705885 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.707561 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7d558494-64d8-42ba-b992-449f4c406597" (UID: "7d558494-64d8-42ba-b992-449f4c406597"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.712914 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723103 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-httpd-run\") pod \"667e006c-09d8-4551-9e7d-466546b549d8\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723161 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq7dz\" (UniqueName: \"kubernetes.io/projected/667e006c-09d8-4551-9e7d-466546b549d8-kube-api-access-rq7dz\") pod \"667e006c-09d8-4551-9e7d-466546b549d8\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723191 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-scripts\") pod \"b66e29d8-870f-417d-898a-fd47e5f16215\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723233 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-internal-tls-certs\") pod \"667e006c-09d8-4551-9e7d-466546b549d8\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723268 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"667e006c-09d8-4551-9e7d-466546b549d8\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723325 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-config-data\") pod \"b66e29d8-870f-417d-898a-fd47e5f16215\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723370 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-combined-ca-bundle\") pod \"667e006c-09d8-4551-9e7d-466546b549d8\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723390 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-config-data\") pod \"667e006c-09d8-4551-9e7d-466546b549d8\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723448 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b66e29d8-870f-417d-898a-fd47e5f16215-horizon-secret-key\") pod \"b66e29d8-870f-417d-898a-fd47e5f16215\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723473 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-logs\") pod \"667e006c-09d8-4551-9e7d-466546b549d8\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723499 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66e29d8-870f-417d-898a-fd47e5f16215-logs\") pod \"b66e29d8-870f-417d-898a-fd47e5f16215\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723520 4651 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lggq5\" (UniqueName: \"kubernetes.io/projected/b66e29d8-870f-417d-898a-fd47e5f16215-kube-api-access-lggq5\") pod \"b66e29d8-870f-417d-898a-fd47e5f16215\" (UID: \"b66e29d8-870f-417d-898a-fd47e5f16215\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.723523 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "667e006c-09d8-4551-9e7d-466546b549d8" (UID: "667e006c-09d8-4551-9e7d-466546b549d8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.728864 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-scripts\") pod \"667e006c-09d8-4551-9e7d-466546b549d8\" (UID: \"667e006c-09d8-4551-9e7d-466546b549d8\") " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.728943 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-scripts" (OuterVolumeSpecName: "scripts") pod "b66e29d8-870f-417d-898a-fd47e5f16215" (UID: "b66e29d8-870f-417d-898a-fd47e5f16215"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.731827 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.731878 4651 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.731891 4651 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.731903 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.731914 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klmdx\" (UniqueName: \"kubernetes.io/projected/7d558494-64d8-42ba-b992-449f4c406597-kube-api-access-klmdx\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.731926 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.731937 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.731948 4651 reconciler_common.go:293] "Volume detached for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.731960 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.731970 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d558494-64d8-42ba-b992-449f4c406597-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.731981 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.732000 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.732022 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.732036 4651 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558494-64d8-42ba-b992-449f4c406597-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.732129 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96e9fa03-a4a5-4ecb-8f87-feb41f41083f-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc 
kubenswrapper[4651]: I1126 15:07:15.732154 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb5a12e-4adf-4864-b97a-b73ec221f326-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.734022 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-config-data" (OuterVolumeSpecName: "config-data") pod "b66e29d8-870f-417d-898a-fd47e5f16215" (UID: "b66e29d8-870f-417d-898a-fd47e5f16215"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.734557 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/667e006c-09d8-4551-9e7d-466546b549d8-kube-api-access-rq7dz" (OuterVolumeSpecName: "kube-api-access-rq7dz") pod "667e006c-09d8-4551-9e7d-466546b549d8" (UID: "667e006c-09d8-4551-9e7d-466546b549d8"). InnerVolumeSpecName "kube-api-access-rq7dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.735525 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b66e29d8-870f-417d-898a-fd47e5f16215-logs" (OuterVolumeSpecName: "logs") pod "b66e29d8-870f-417d-898a-fd47e5f16215" (UID: "b66e29d8-870f-417d-898a-fd47e5f16215"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.735783 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-logs" (OuterVolumeSpecName: "logs") pod "667e006c-09d8-4551-9e7d-466546b549d8" (UID: "667e006c-09d8-4551-9e7d-466546b549d8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.740228 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66e29d8-870f-417d-898a-fd47e5f16215-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b66e29d8-870f-417d-898a-fd47e5f16215" (UID: "b66e29d8-870f-417d-898a-fd47e5f16215"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.758426 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "667e006c-09d8-4551-9e7d-466546b549d8" (UID: "667e006c-09d8-4551-9e7d-466546b549d8"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.761160 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b66e29d8-870f-417d-898a-fd47e5f16215-kube-api-access-lggq5" (OuterVolumeSpecName: "kube-api-access-lggq5") pod "b66e29d8-870f-417d-898a-fd47e5f16215" (UID: "b66e29d8-870f-417d-898a-fd47e5f16215"). InnerVolumeSpecName "kube-api-access-lggq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.762640 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-scripts" (OuterVolumeSpecName: "scripts") pod "667e006c-09d8-4551-9e7d-466546b549d8" (UID: "667e006c-09d8-4551-9e7d-466546b549d8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.788904 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "667e006c-09d8-4551-9e7d-466546b549d8" (UID: "667e006c-09d8-4551-9e7d-466546b549d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.800369 4651 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.805385 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "667e006c-09d8-4551-9e7d-466546b549d8" (UID: "667e006c-09d8-4551-9e7d-466546b549d8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.808944 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-config-data" (OuterVolumeSpecName: "config-data") pod "667e006c-09d8-4551-9e7d-466546b549d8" (UID: "667e006c-09d8-4551-9e7d-466546b549d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833683 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833747 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833761 4651 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833772 4651 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b66e29d8-870f-417d-898a-fd47e5f16215-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833783 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/667e006c-09d8-4551-9e7d-466546b549d8-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833794 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66e29d8-870f-417d-898a-fd47e5f16215-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833804 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lggq5\" (UniqueName: \"kubernetes.io/projected/b66e29d8-870f-417d-898a-fd47e5f16215-kube-api-access-lggq5\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833815 4651 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833826 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq7dz\" (UniqueName: \"kubernetes.io/projected/667e006c-09d8-4551-9e7d-466546b549d8-kube-api-access-rq7dz\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833837 4651 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/667e006c-09d8-4551-9e7d-466546b549d8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833862 4651 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.833874 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b66e29d8-870f-417d-898a-fd47e5f16215-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.855247 4651 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.936161 4651 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.976989 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gqvl5" event={"ID":"5cb5a12e-4adf-4864-b97a-b73ec221f326","Type":"ContainerDied","Data":"5c24f36aa34d4fc6aa835e0aea7e198d1d93d4c838de4484e41df83b9de853fc"} Nov 26 
15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.976998 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gqvl5" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.977029 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c24f36aa34d4fc6aa835e0aea7e198d1d93d4c838de4484e41df83b9de853fc" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.989349 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-779f7b5c77-w99hg" event={"ID":"b66e29d8-870f-417d-898a-fd47e5f16215","Type":"ContainerDied","Data":"2fea52185403975cfbaebcf9adc01f5af5a3f645e09538d36719622b69423669"} Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.989432 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-779f7b5c77-w99hg" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.998625 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d558494-64d8-42ba-b992-449f4c406597","Type":"ContainerDied","Data":"228978d0392f5251009fd4e87dba51dda9d37d2f9c1e6161c8a371ee58592aca"} Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.998748 4651 scope.go:117] "RemoveContainer" containerID="3cd199346d35d91443c454fd6527743909f8bc7debce030fff6d9d552693e8f7" Nov 26 15:07:15 crc kubenswrapper[4651]: I1126 15:07:15.998653 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.004758 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" event={"ID":"96e9fa03-a4a5-4ecb-8f87-feb41f41083f","Type":"ContainerDied","Data":"fc65e9b354f2229669ad30c011dbe04cc266acdfc3ccf95f0d8d8f250c6fa6c6"} Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.004857 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-5vwzl" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.009827 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.010577 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"667e006c-09d8-4551-9e7d-466546b549d8","Type":"ContainerDied","Data":"7cd7167f23f9b849518ab086ccc2fc8a9597a2fd8d6f052c9fba1d00740c643f"} Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.075332 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.094443 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.135592 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:07:16 crc kubenswrapper[4651]: E1126 15:07:16.136470 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d558494-64d8-42ba-b992-449f4c406597" containerName="glance-log" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.136496 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d558494-64d8-42ba-b992-449f4c406597" containerName="glance-log" Nov 26 15:07:16 crc kubenswrapper[4651]: 
E1126 15:07:16.136518 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d558494-64d8-42ba-b992-449f4c406597" containerName="glance-httpd" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.136526 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d558494-64d8-42ba-b992-449f4c406597" containerName="glance-httpd" Nov 26 15:07:16 crc kubenswrapper[4651]: E1126 15:07:16.136578 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerName="dnsmasq-dns" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.136586 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerName="dnsmasq-dns" Nov 26 15:07:16 crc kubenswrapper[4651]: E1126 15:07:16.136608 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667e006c-09d8-4551-9e7d-466546b549d8" containerName="glance-httpd" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.136615 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="667e006c-09d8-4551-9e7d-466546b549d8" containerName="glance-httpd" Nov 26 15:07:16 crc kubenswrapper[4651]: E1126 15:07:16.136630 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerName="init" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.136637 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerName="init" Nov 26 15:07:16 crc kubenswrapper[4651]: E1126 15:07:16.136658 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cb5a12e-4adf-4864-b97a-b73ec221f326" containerName="keystone-bootstrap" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.136671 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cb5a12e-4adf-4864-b97a-b73ec221f326" containerName="keystone-bootstrap" Nov 26 15:07:16 crc kubenswrapper[4651]: E1126 15:07:16.136690 4651 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="667e006c-09d8-4551-9e7d-466546b549d8" containerName="glance-log" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.136697 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="667e006c-09d8-4551-9e7d-466546b549d8" containerName="glance-log" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.137127 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" containerName="dnsmasq-dns" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.137162 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="667e006c-09d8-4551-9e7d-466546b549d8" containerName="glance-log" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.137189 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="667e006c-09d8-4551-9e7d-466546b549d8" containerName="glance-httpd" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.137207 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d558494-64d8-42ba-b992-449f4c406597" containerName="glance-log" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.137223 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cb5a12e-4adf-4864-b97a-b73ec221f326" containerName="keystone-bootstrap" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.137245 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d558494-64d8-42ba-b992-449f4c406597" containerName="glance-httpd" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.139206 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.145263 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-62rlj" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.145536 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.145769 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.146252 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.189114 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-779f7b5c77-w99hg"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.193655 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.200595 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-779f7b5c77-w99hg"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.216555 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.222868 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.243243 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 
15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.243313 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-config-data\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.243349 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-scripts\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.243365 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.243393 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csjxj\" (UniqueName: \"kubernetes.io/projected/7194a748-fcec-46b2-b6b7-a3af88cd8e14-kube-api-access-csjxj\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.243416 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " 
pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.243457 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-logs\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.243475 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.247588 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-5vwzl"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.274808 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-5vwzl"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.282150 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.285192 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.287440 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.287646 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.295870 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.349271 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.349355 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-config-data\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.349401 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-scripts\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.349425 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.349463 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csjxj\" (UniqueName: \"kubernetes.io/projected/7194a748-fcec-46b2-b6b7-a3af88cd8e14-kube-api-access-csjxj\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.349484 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.349532 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-logs\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.349553 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.350790 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.354480 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.356754 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-logs\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.358288 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.382746 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csjxj\" (UniqueName: \"kubernetes.io/projected/7194a748-fcec-46b2-b6b7-a3af88cd8e14-kube-api-access-csjxj\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.388736 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.390855 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.401156 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-config-data\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.452463 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.452607 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.452674 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.452759 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzdkv\" (UniqueName: \"kubernetes.io/projected/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-kube-api-access-dzdkv\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.452801 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-logs\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.452872 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.452927 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.452987 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.491522 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.554238 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.554287 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.554324 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.554395 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " 
pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.554427 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.554456 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.554532 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzdkv\" (UniqueName: \"kubernetes.io/projected/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-kube-api-access-dzdkv\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.554570 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-logs\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.555160 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-logs\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 
15:07:16.555951 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.556068 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.560085 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.560315 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.575546 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.576545 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.583543 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzdkv\" (UniqueName: \"kubernetes.io/projected/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-kube-api-access-dzdkv\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.605970 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.616435 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.686583 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cf66b5549-hd4q7" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.691686 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.730984 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gqvl5"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.737483 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gqvl5"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.783704 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.810968 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-24pqd"] Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.811973 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.814715 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.817082 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.817327 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.817345 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.817449 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k89xc" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.827995 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-24pqd"] Nov 26 15:07:16 crc kubenswrapper[4651]: E1126 15:07:16.830804 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Nov 26 15:07:16 crc kubenswrapper[4651]: E1126 15:07:16.830945 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkhsp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-p6s6f_openstack(81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:07:16 crc kubenswrapper[4651]: E1126 15:07:16.832064 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-p6s6f" 
podUID="81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858487 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-scripts\") pod \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858557 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5446188b-09fd-46a6-acaf-7723dae3c68c-logs\") pod \"5446188b-09fd-46a6-acaf-7723dae3c68c\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858591 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-config-data\") pod \"5446188b-09fd-46a6-acaf-7723dae3c68c\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858636 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5446188b-09fd-46a6-acaf-7723dae3c68c-horizon-secret-key\") pod \"5446188b-09fd-46a6-acaf-7723dae3c68c\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858661 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8q96\" (UniqueName: \"kubernetes.io/projected/5446188b-09fd-46a6-acaf-7723dae3c68c-kube-api-access-q8q96\") pod \"5446188b-09fd-46a6-acaf-7723dae3c68c\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858707 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rffk2\" (UniqueName: 
\"kubernetes.io/projected/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-kube-api-access-rffk2\") pod \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858725 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-logs\") pod \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858755 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-scripts\") pod \"5446188b-09fd-46a6-acaf-7723dae3c68c\" (UID: \"5446188b-09fd-46a6-acaf-7723dae3c68c\") " Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858778 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-horizon-secret-key\") pod \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858826 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-config-data\") pod \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\" (UID: \"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5\") " Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858832 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5446188b-09fd-46a6-acaf-7723dae3c68c-logs" (OuterVolumeSpecName: "logs") pod "5446188b-09fd-46a6-acaf-7723dae3c68c" (UID: "5446188b-09fd-46a6-acaf-7723dae3c68c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.858936 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-scripts" (OuterVolumeSpecName: "scripts") pod "6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5" (UID: "6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.859218 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.859234 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5446188b-09fd-46a6-acaf-7723dae3c68c-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.859601 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-logs" (OuterVolumeSpecName: "logs") pod "6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5" (UID: "6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.859820 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-config-data" (OuterVolumeSpecName: "config-data") pod "6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5" (UID: "6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.860354 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-scripts" (OuterVolumeSpecName: "scripts") pod "5446188b-09fd-46a6-acaf-7723dae3c68c" (UID: "5446188b-09fd-46a6-acaf-7723dae3c68c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.860563 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-config-data" (OuterVolumeSpecName: "config-data") pod "5446188b-09fd-46a6-acaf-7723dae3c68c" (UID: "5446188b-09fd-46a6-acaf-7723dae3c68c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.862028 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-kube-api-access-rffk2" (OuterVolumeSpecName: "kube-api-access-rffk2") pod "6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5" (UID: "6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5"). InnerVolumeSpecName "kube-api-access-rffk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.863898 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5" (UID: "6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.864495 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5446188b-09fd-46a6-acaf-7723dae3c68c-kube-api-access-q8q96" (OuterVolumeSpecName: "kube-api-access-q8q96") pod "5446188b-09fd-46a6-acaf-7723dae3c68c" (UID: "5446188b-09fd-46a6-acaf-7723dae3c68c"). InnerVolumeSpecName "kube-api-access-q8q96". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.864868 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5446188b-09fd-46a6-acaf-7723dae3c68c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5446188b-09fd-46a6-acaf-7723dae3c68c" (UID: "5446188b-09fd-46a6-acaf-7723dae3c68c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961181 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-config-data\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961314 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-fernet-keys\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961349 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-scripts\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961386 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-combined-ca-bundle\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961418 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5htdd\" (UniqueName: \"kubernetes.io/projected/2eca4c8f-cc45-46f6-8730-187af536d3b1-kube-api-access-5htdd\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961468 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-credential-keys\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961545 4651 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5446188b-09fd-46a6-acaf-7723dae3c68c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961565 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8q96\" (UniqueName: \"kubernetes.io/projected/5446188b-09fd-46a6-acaf-7723dae3c68c-kube-api-access-q8q96\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:16 crc 
kubenswrapper[4651]: I1126 15:07:16.961789 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rffk2\" (UniqueName: \"kubernetes.io/projected/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-kube-api-access-rffk2\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961807 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961819 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961830 4651 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961841 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:16 crc kubenswrapper[4651]: I1126 15:07:16.961851 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5446188b-09fd-46a6-acaf-7723dae3c68c-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.020226 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cf66b5549-hd4q7" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.020226 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cf66b5549-hd4q7" event={"ID":"6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5","Type":"ContainerDied","Data":"f41156ae268a3b73a6cc50d77d4c59c8199b12c0b61227af633798d84a30b42e"} Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.022427 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64574d9cd7-txx86" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.031387 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64574d9cd7-txx86" event={"ID":"5446188b-09fd-46a6-acaf-7723dae3c68c","Type":"ContainerDied","Data":"575463b0ab3c844106adebb58dee75a784f305d0f0ba23d2434bb5fe29770012"} Nov 26 15:07:17 crc kubenswrapper[4651]: E1126 15:07:17.035849 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-p6s6f" podUID="81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.062898 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-config-data\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.063352 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-fernet-keys\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " 
pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.063380 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-scripts\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.063410 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-combined-ca-bundle\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.063431 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5htdd\" (UniqueName: \"kubernetes.io/projected/2eca4c8f-cc45-46f6-8730-187af536d3b1-kube-api-access-5htdd\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.063471 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-credential-keys\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.081706 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-fernet-keys\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.095682 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-credential-keys\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.107720 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-config-data\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.114814 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64574d9cd7-txx86"] Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.115325 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-scripts\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.117806 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-combined-ca-bundle\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.128673 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5htdd\" (UniqueName: \"kubernetes.io/projected/2eca4c8f-cc45-46f6-8730-187af536d3b1-kube-api-access-5htdd\") pod \"keystone-bootstrap-24pqd\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 
15:07:17.132233 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.138219 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64574d9cd7-txx86"] Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.164008 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cf66b5549-hd4q7"] Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.183945 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cf66b5549-hd4q7"] Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.455593 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5446188b-09fd-46a6-acaf-7723dae3c68c" path="/var/lib/kubelet/pods/5446188b-09fd-46a6-acaf-7723dae3c68c/volumes" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.456309 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cb5a12e-4adf-4864-b97a-b73ec221f326" path="/var/lib/kubelet/pods/5cb5a12e-4adf-4864-b97a-b73ec221f326/volumes" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.457232 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="667e006c-09d8-4551-9e7d-466546b549d8" path="/var/lib/kubelet/pods/667e006c-09d8-4551-9e7d-466546b549d8/volumes" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.458436 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5" path="/var/lib/kubelet/pods/6b19bf8a-8202-4e5f-b3ac-6baa8316f3b5/volumes" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.458846 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d558494-64d8-42ba-b992-449f4c406597" path="/var/lib/kubelet/pods/7d558494-64d8-42ba-b992-449f4c406597/volumes" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.459470 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="96e9fa03-a4a5-4ecb-8f87-feb41f41083f" path="/var/lib/kubelet/pods/96e9fa03-a4a5-4ecb-8f87-feb41f41083f/volumes" Nov 26 15:07:17 crc kubenswrapper[4651]: I1126 15:07:17.460548 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b66e29d8-870f-417d-898a-fd47e5f16215" path="/var/lib/kubelet/pods/b66e29d8-870f-417d-898a-fd47e5f16215/volumes" Nov 26 15:07:18 crc kubenswrapper[4651]: E1126 15:07:18.264005 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 26 15:07:18 crc kubenswrapper[4651]: E1126 15:07:18.264714 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:n
il,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sff7j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-wzxcr_openstack(0b39efce-2985-4f46-91a2-bb397f605c9c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:07:18 crc kubenswrapper[4651]: E1126 15:07:18.265920 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-wzxcr" podUID="0b39efce-2985-4f46-91a2-bb397f605c9c" Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.352793 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9glnv" Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.484354 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-combined-ca-bundle\") pod \"147296af-97b7-4982-ab39-d7f3b78f042d\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.484540 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r896s\" (UniqueName: \"kubernetes.io/projected/147296af-97b7-4982-ab39-d7f3b78f042d-kube-api-access-r896s\") pod \"147296af-97b7-4982-ab39-d7f3b78f042d\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.484641 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-config\") pod \"147296af-97b7-4982-ab39-d7f3b78f042d\" (UID: \"147296af-97b7-4982-ab39-d7f3b78f042d\") " Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.501453 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147296af-97b7-4982-ab39-d7f3b78f042d-kube-api-access-r896s" (OuterVolumeSpecName: "kube-api-access-r896s") pod "147296af-97b7-4982-ab39-d7f3b78f042d" (UID: "147296af-97b7-4982-ab39-d7f3b78f042d"). InnerVolumeSpecName "kube-api-access-r896s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.507669 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "147296af-97b7-4982-ab39-d7f3b78f042d" (UID: "147296af-97b7-4982-ab39-d7f3b78f042d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.509581 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-config" (OuterVolumeSpecName: "config") pod "147296af-97b7-4982-ab39-d7f3b78f042d" (UID: "147296af-97b7-4982-ab39-d7f3b78f042d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.586917 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.586956 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r896s\" (UniqueName: \"kubernetes.io/projected/147296af-97b7-4982-ab39-d7f3b78f042d-kube-api-access-r896s\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.586993 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/147296af-97b7-4982-ab39-d7f3b78f042d-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.668173 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f54c7c77d-rx8gm"] Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.685669 4651 scope.go:117] "RemoveContainer" containerID="21e3f43f581cc8789810e3bd83328eadf41c2896b4258a2363e1ab3222ba8a28" Nov 26 15:07:18 crc kubenswrapper[4651]: W1126 15:07:18.718698 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c09de21_84b0_440d_b34c_3054ec6741fc.slice/crio-0644ed8307061953a2487fc33660bfe8630a0f4395ef246603642d4bce3f689b WatchSource:0}: Error finding container 
0644ed8307061953a2487fc33660bfe8630a0f4395ef246603642d4bce3f689b: Status 404 returned error can't find the container with id 0644ed8307061953a2487fc33660bfe8630a0f4395ef246603642d4bce3f689b Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.790406 4651 scope.go:117] "RemoveContainer" containerID="38d2aa0310885489fb063b1a8b2ed3ccf0d19b385a8afba9a40c39fe0111e6ef" Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.862265 4651 scope.go:117] "RemoveContainer" containerID="c56239cc30c2b6bffa54475fe5d1da40ce0443cd645e144b63147f83ffefced7" Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.901427 4651 scope.go:117] "RemoveContainer" containerID="e36d50fff9338b737033c6b4a09b0064ce9a8e1e21ba302863a6876786b9bb77" Nov 26 15:07:18 crc kubenswrapper[4651]: I1126 15:07:18.943884 4651 scope.go:117] "RemoveContainer" containerID="7df89a1d08339746fdb9f48811448aba828e31833d02951604c1fd95194ebe4f" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.040659 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9glnv" event={"ID":"147296af-97b7-4982-ab39-d7f3b78f042d","Type":"ContainerDied","Data":"9b80947e71d93ff12c1bb485302fc49a88aa84d2e5df5581e3e2036cd5698c74"} Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.041233 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b80947e71d93ff12c1bb485302fc49a88aa84d2e5df5581e3e2036cd5698c74" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.040930 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9glnv" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.045107 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f54c7c77d-rx8gm" event={"ID":"5c09de21-84b0-440d-b34c-3054ec6741fc","Type":"ContainerStarted","Data":"0644ed8307061953a2487fc33660bfe8630a0f4395ef246603642d4bce3f689b"} Nov 26 15:07:19 crc kubenswrapper[4651]: E1126 15:07:19.046403 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-wzxcr" podUID="0b39efce-2985-4f46-91a2-bb397f605c9c" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.110426 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6974b49b94-vzn8h"] Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.258996 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-24pqd"] Nov 26 15:07:19 crc kubenswrapper[4651]: W1126 15:07:19.264028 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eca4c8f_cc45_46f6_8730_187af536d3b1.slice/crio-e29f77cff424184b7d4443740d5ecda635dd26b2d81f6b4ff499c3142cda016a WatchSource:0}: Error finding container e29f77cff424184b7d4443740d5ecda635dd26b2d81f6b4ff499c3142cda016a: Status 404 returned error can't find the container with id e29f77cff424184b7d4443740d5ecda635dd26b2d81f6b4ff499c3142cda016a Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.274995 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.369803 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:07:19 crc kubenswrapper[4651]: 
I1126 15:07:19.692456 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fb745b69-zhts9"] Nov 26 15:07:19 crc kubenswrapper[4651]: E1126 15:07:19.693480 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147296af-97b7-4982-ab39-d7f3b78f042d" containerName="neutron-db-sync" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.701126 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="147296af-97b7-4982-ab39-d7f3b78f042d" containerName="neutron-db-sync" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.701886 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="147296af-97b7-4982-ab39-d7f3b78f042d" containerName="neutron-db-sync" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.704358 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-zhts9"] Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.712268 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.767417 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-746685dd-k8lhz"] Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.768843 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.771406 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hnxt\" (UniqueName: \"kubernetes.io/projected/2f337ea2-409f-4a06-8115-16aa4137f6bd-kube-api-access-6hnxt\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.771500 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd96t\" (UniqueName: \"kubernetes.io/projected/0989eafe-2213-40fa-89b4-f4df03c3d934-kube-api-access-kd96t\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.771529 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.771551 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-ovndb-tls-certs\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.771574 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-dns-svc\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: 
\"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.771601 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-config\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.771755 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.771791 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-config\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.771833 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-httpd-config\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.771852 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-combined-ca-bundle\") pod \"neutron-746685dd-k8lhz\" (UID: 
\"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.776151 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.776344 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-s27gg" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.777668 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.780146 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.795845 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-746685dd-k8lhz"] Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.872458 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-httpd-config\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.872497 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-combined-ca-bundle\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.872526 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hnxt\" (UniqueName: \"kubernetes.io/projected/2f337ea2-409f-4a06-8115-16aa4137f6bd-kube-api-access-6hnxt\") pod \"neutron-746685dd-k8lhz\" (UID: 
\"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.872564 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd96t\" (UniqueName: \"kubernetes.io/projected/0989eafe-2213-40fa-89b4-f4df03c3d934-kube-api-access-kd96t\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.872580 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.872595 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-ovndb-tls-certs\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.872610 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-dns-svc\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.872630 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-config\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc 
kubenswrapper[4651]: I1126 15:07:19.872725 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.872749 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-config\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.873516 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-config\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.880117 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-httpd-config\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.880868 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-dns-svc\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.881434 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.882332 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.882868 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-combined-ca-bundle\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.887902 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-ovndb-tls-certs\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.893304 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-config\") pod \"neutron-746685dd-k8lhz\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.904084 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hnxt\" (UniqueName: \"kubernetes.io/projected/2f337ea2-409f-4a06-8115-16aa4137f6bd-kube-api-access-6hnxt\") pod \"neutron-746685dd-k8lhz\" (UID: 
\"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:19 crc kubenswrapper[4651]: I1126 15:07:19.913093 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd96t\" (UniqueName: \"kubernetes.io/projected/0989eafe-2213-40fa-89b4-f4df03c3d934-kube-api-access-kd96t\") pod \"dnsmasq-dns-fb745b69-zhts9\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.085264 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34a40fec-099f-437f-b32a-2b81bf3b32f8","Type":"ContainerStarted","Data":"982ad101dcca888166edea02c1d706d541f1e1b6586983041bc2e91bcdd03cc4"} Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.109982 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-24pqd" event={"ID":"2eca4c8f-cc45-46f6-8730-187af536d3b1","Type":"ContainerStarted","Data":"d8a8e392b5d7dad2f06ca07da9cb43f1113ca204035247a724c86fcddcc08f46"} Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.110029 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-24pqd" event={"ID":"2eca4c8f-cc45-46f6-8730-187af536d3b1","Type":"ContainerStarted","Data":"e29f77cff424184b7d4443740d5ecda635dd26b2d81f6b4ff499c3142cda016a"} Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.116837 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb","Type":"ContainerStarted","Data":"6cd4b5123608a3c9f55fa91ce3ce8580fa91d20c75361541f3add47ebad54264"} Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.123409 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.123891 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n9whp" event={"ID":"c1259668-c013-4143-b8b4-677a639a764e","Type":"ContainerStarted","Data":"5492b1754fa8232d4cd55c7e05742c9f77be19b628a03f36d29f730caabe2475"} Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.143293 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-24pqd" podStartSLOduration=4.143274744 podStartE2EDuration="4.143274744s" podCreationTimestamp="2025-11-26 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:20.126169965 +0000 UTC m=+1007.551917569" watchObservedRunningTime="2025-11-26 15:07:20.143274744 +0000 UTC m=+1007.569022348" Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.145268 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.153180 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-n9whp" podStartSLOduration=4.95492639 podStartE2EDuration="40.153164966s" podCreationTimestamp="2025-11-26 15:06:40 +0000 UTC" firstStartedPulling="2025-11-26 15:06:43.030113714 +0000 UTC m=+970.455861318" lastFinishedPulling="2025-11-26 15:07:18.22835229 +0000 UTC m=+1005.654099894" observedRunningTime="2025-11-26 15:07:20.151846909 +0000 UTC m=+1007.577594513" watchObservedRunningTime="2025-11-26 15:07:20.153164966 +0000 UTC m=+1007.578912570" Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.183350 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f54c7c77d-rx8gm" event={"ID":"5c09de21-84b0-440d-b34c-3054ec6741fc","Type":"ContainerStarted","Data":"325a6b30fefeeb361f149dfaba424a0540b7edaa384eea02a3eec7f7e1092edd"} Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.202351 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7194a748-fcec-46b2-b6b7-a3af88cd8e14","Type":"ContainerStarted","Data":"390f2bf62f6eb34eef45fe1638494d67d84583759fb9ae8626a53a3607ff275f"} Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.204705 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6974b49b94-vzn8h" event={"ID":"97c5789f-f8f7-4780-8c73-e34bc5bb4f56","Type":"ContainerStarted","Data":"50a170759d63aa054145b0a8a35120c8a5e7af6d825d82402ece71dc3ed54d13"} Nov 26 15:07:20 crc kubenswrapper[4651]: I1126 15:07:20.685068 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-zhts9"] Nov 26 15:07:21 crc kubenswrapper[4651]: I1126 15:07:21.096984 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-746685dd-k8lhz"] Nov 26 15:07:21 crc kubenswrapper[4651]: I1126 
15:07:21.226792 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-zhts9" event={"ID":"0989eafe-2213-40fa-89b4-f4df03c3d934","Type":"ContainerStarted","Data":"fd6292db0e0169b5a21270062d4dc5b40619437121e3ab781cd36981af2b0a5e"} Nov 26 15:07:21 crc kubenswrapper[4651]: I1126 15:07:21.226834 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-zhts9" event={"ID":"0989eafe-2213-40fa-89b4-f4df03c3d934","Type":"ContainerStarted","Data":"2a50bdad0e53b3cbc550079693e3b33fad9be72605f9976a3f3a069bd74e053d"} Nov 26 15:07:21 crc kubenswrapper[4651]: I1126 15:07:21.239995 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-746685dd-k8lhz" event={"ID":"2f337ea2-409f-4a06-8115-16aa4137f6bd","Type":"ContainerStarted","Data":"03a38354795026d12fe1593f8ce65d2cc234216b00e4bf3fac8e7cac7d88cb0d"} Nov 26 15:07:21 crc kubenswrapper[4651]: I1126 15:07:21.247011 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f54c7c77d-rx8gm" event={"ID":"5c09de21-84b0-440d-b34c-3054ec6741fc","Type":"ContainerStarted","Data":"56761142c110a594c6d6a7518e9e4944e0f87669709325bcff97b8c278e4b419"} Nov 26 15:07:21 crc kubenswrapper[4651]: I1126 15:07:21.254948 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7194a748-fcec-46b2-b6b7-a3af88cd8e14","Type":"ContainerStarted","Data":"cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b"} Nov 26 15:07:21 crc kubenswrapper[4651]: I1126 15:07:21.285070 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6974b49b94-vzn8h" event={"ID":"97c5789f-f8f7-4780-8c73-e34bc5bb4f56","Type":"ContainerStarted","Data":"bc932f0bacd9c20ebf1824e9687b4f2688afd1574336c4d92ff1fad88d1f5394"} Nov 26 15:07:21 crc kubenswrapper[4651]: I1126 15:07:21.285132 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6974b49b94-vzn8h" 
event={"ID":"97c5789f-f8f7-4780-8c73-e34bc5bb4f56","Type":"ContainerStarted","Data":"041b9e0af2f72c70708cca2245ba415e6c5829af6bb51c79a57997f41bb12658"} Nov 26 15:07:21 crc kubenswrapper[4651]: I1126 15:07:21.294430 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb","Type":"ContainerStarted","Data":"b18db86a93db98c43ae81c9e05be8dec3a64dcb4461b4f2f64d198b8779af2c0"} Nov 26 15:07:21 crc kubenswrapper[4651]: I1126 15:07:21.336387 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f54c7c77d-rx8gm" podStartSLOduration=31.818898329 podStartE2EDuration="32.336365897s" podCreationTimestamp="2025-11-26 15:06:49 +0000 UTC" firstStartedPulling="2025-11-26 15:07:18.721264715 +0000 UTC m=+1006.147012319" lastFinishedPulling="2025-11-26 15:07:19.238732283 +0000 UTC m=+1006.664479887" observedRunningTime="2025-11-26 15:07:21.282423718 +0000 UTC m=+1008.708171332" watchObservedRunningTime="2025-11-26 15:07:21.336365897 +0000 UTC m=+1008.762113501" Nov 26 15:07:22 crc kubenswrapper[4651]: I1126 15:07:22.311286 4651 generic.go:334] "Generic (PLEG): container finished" podID="0989eafe-2213-40fa-89b4-f4df03c3d934" containerID="fd6292db0e0169b5a21270062d4dc5b40619437121e3ab781cd36981af2b0a5e" exitCode=0 Nov 26 15:07:22 crc kubenswrapper[4651]: I1126 15:07:22.311848 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-zhts9" event={"ID":"0989eafe-2213-40fa-89b4-f4df03c3d934","Type":"ContainerDied","Data":"fd6292db0e0169b5a21270062d4dc5b40619437121e3ab781cd36981af2b0a5e"} Nov 26 15:07:22 crc kubenswrapper[4651]: I1126 15:07:22.318710 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-746685dd-k8lhz" event={"ID":"2f337ea2-409f-4a06-8115-16aa4137f6bd","Type":"ContainerStarted","Data":"60fb8efc9bfc84f98cb543e4dfb966eb4ec00c9ba44ef1e538e94c292bc1b069"} Nov 26 15:07:22 crc 
kubenswrapper[4651]: I1126 15:07:22.318750 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-746685dd-k8lhz" event={"ID":"2f337ea2-409f-4a06-8115-16aa4137f6bd","Type":"ContainerStarted","Data":"20a18f2c3458a1d3cd6204c7a47f48a52e3c0eefd1b0c6fd4c15555e3a6d187c"} Nov 26 15:07:22 crc kubenswrapper[4651]: I1126 15:07:22.319472 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:22 crc kubenswrapper[4651]: I1126 15:07:22.331336 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7194a748-fcec-46b2-b6b7-a3af88cd8e14","Type":"ContainerStarted","Data":"1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88"} Nov 26 15:07:22 crc kubenswrapper[4651]: I1126 15:07:22.345968 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6974b49b94-vzn8h" podStartSLOduration=32.516086636 podStartE2EDuration="33.345949799s" podCreationTimestamp="2025-11-26 15:06:49 +0000 UTC" firstStartedPulling="2025-11-26 15:07:19.122363443 +0000 UTC m=+1006.548111047" lastFinishedPulling="2025-11-26 15:07:19.952226606 +0000 UTC m=+1007.377974210" observedRunningTime="2025-11-26 15:07:21.326453036 +0000 UTC m=+1008.752200650" watchObservedRunningTime="2025-11-26 15:07:22.345949799 +0000 UTC m=+1009.771697403" Nov 26 15:07:22 crc kubenswrapper[4651]: I1126 15:07:22.349663 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb","Type":"ContainerStarted","Data":"9ec0d32784ed5ced6da55fc86d918723f8ec5b5e23f395de2b4b6ddd05c4482a"} Nov 26 15:07:22 crc kubenswrapper[4651]: I1126 15:07:22.373734 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.37371665 podStartE2EDuration="6.37371665s" podCreationTimestamp="2025-11-26 
15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:22.372195408 +0000 UTC m=+1009.797943032" watchObservedRunningTime="2025-11-26 15:07:22.37371665 +0000 UTC m=+1009.799464254" Nov 26 15:07:22 crc kubenswrapper[4651]: I1126 15:07:22.427843 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-746685dd-k8lhz" podStartSLOduration=3.427822734 podStartE2EDuration="3.427822734s" podCreationTimestamp="2025-11-26 15:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:22.419819464 +0000 UTC m=+1009.845567068" watchObservedRunningTime="2025-11-26 15:07:22.427822734 +0000 UTC m=+1009.853570338" Nov 26 15:07:22 crc kubenswrapper[4651]: I1126 15:07:22.452184 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.452167941 podStartE2EDuration="6.452167941s" podCreationTimestamp="2025-11-26 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:22.44848105 +0000 UTC m=+1009.874228654" watchObservedRunningTime="2025-11-26 15:07:22.452167941 +0000 UTC m=+1009.877915545" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.364459 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-zhts9" event={"ID":"0989eafe-2213-40fa-89b4-f4df03c3d934","Type":"ContainerStarted","Data":"1edbe76f7461d0e15bb50e3d25c46101540fe7fa380949172e2d3e161d96af11"} Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.365347 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.664249 4651 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fb745b69-zhts9" podStartSLOduration=4.664227664 podStartE2EDuration="4.664227664s" podCreationTimestamp="2025-11-26 15:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:23.39491899 +0000 UTC m=+1010.820666594" watchObservedRunningTime="2025-11-26 15:07:23.664227664 +0000 UTC m=+1011.089975268" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.671142 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5659bd5cb9-6wbmd"] Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.673294 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.685714 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.686078 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5659bd5cb9-6wbmd"] Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.687622 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.768028 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-config\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.768094 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-ovndb-tls-certs\") pod 
\"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.768136 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-internal-tls-certs\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.768153 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-public-tls-certs\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.768423 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-combined-ca-bundle\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.768523 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-httpd-config\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.768550 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl6hd\" (UniqueName: 
\"kubernetes.io/projected/792fb64e-7839-4871-9f5e-3799da118c4d-kube-api-access-hl6hd\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.870697 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-combined-ca-bundle\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.871885 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-httpd-config\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.871916 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl6hd\" (UniqueName: \"kubernetes.io/projected/792fb64e-7839-4871-9f5e-3799da118c4d-kube-api-access-hl6hd\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.872004 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-config\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.872028 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-ovndb-tls-certs\") pod 
\"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.872081 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-internal-tls-certs\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.872096 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-public-tls-certs\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.882412 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-config\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.893681 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-internal-tls-certs\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.894292 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-combined-ca-bundle\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" 
Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.894854 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-ovndb-tls-certs\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.894938 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-httpd-config\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.895521 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/792fb64e-7839-4871-9f5e-3799da118c4d-public-tls-certs\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:23 crc kubenswrapper[4651]: I1126 15:07:23.917990 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl6hd\" (UniqueName: \"kubernetes.io/projected/792fb64e-7839-4871-9f5e-3799da118c4d-kube-api-access-hl6hd\") pod \"neutron-5659bd5cb9-6wbmd\" (UID: \"792fb64e-7839-4871-9f5e-3799da118c4d\") " pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:24 crc kubenswrapper[4651]: I1126 15:07:24.001075 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:24 crc kubenswrapper[4651]: I1126 15:07:24.723209 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5659bd5cb9-6wbmd"] Nov 26 15:07:24 crc kubenswrapper[4651]: W1126 15:07:24.747186 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod792fb64e_7839_4871_9f5e_3799da118c4d.slice/crio-8980c009baf2200b31267439d99b1cea04eac74aec1525cf7e5fa078dd62895c WatchSource:0}: Error finding container 8980c009baf2200b31267439d99b1cea04eac74aec1525cf7e5fa078dd62895c: Status 404 returned error can't find the container with id 8980c009baf2200b31267439d99b1cea04eac74aec1525cf7e5fa078dd62895c Nov 26 15:07:25 crc kubenswrapper[4651]: I1126 15:07:25.413529 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5659bd5cb9-6wbmd" event={"ID":"792fb64e-7839-4871-9f5e-3799da118c4d","Type":"ContainerStarted","Data":"8980c009baf2200b31267439d99b1cea04eac74aec1525cf7e5fa078dd62895c"} Nov 26 15:07:26 crc kubenswrapper[4651]: I1126 15:07:26.417452 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5659bd5cb9-6wbmd" event={"ID":"792fb64e-7839-4871-9f5e-3799da118c4d","Type":"ContainerStarted","Data":"4a100df649fd7727c88260ac078d56c56c294eb5d4ccbfd25ad298006175ee13"} Nov 26 15:07:26 crc kubenswrapper[4651]: I1126 15:07:26.617677 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:26 crc kubenswrapper[4651]: I1126 15:07:26.617752 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:26 crc kubenswrapper[4651]: I1126 15:07:26.651505 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:26 crc kubenswrapper[4651]: I1126 
15:07:26.664254 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:26 crc kubenswrapper[4651]: I1126 15:07:26.784401 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 15:07:26 crc kubenswrapper[4651]: I1126 15:07:26.784464 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 15:07:26 crc kubenswrapper[4651]: I1126 15:07:26.819321 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 15:07:26 crc kubenswrapper[4651]: I1126 15:07:26.850469 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 15:07:27 crc kubenswrapper[4651]: I1126 15:07:27.426893 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 26 15:07:27 crc kubenswrapper[4651]: I1126 15:07:27.427255 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:27 crc kubenswrapper[4651]: I1126 15:07:27.427267 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 26 15:07:27 crc kubenswrapper[4651]: I1126 15:07:27.427276 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:28 crc kubenswrapper[4651]: I1126 15:07:28.436554 4651 generic.go:334] "Generic (PLEG): container finished" podID="c1259668-c013-4143-b8b4-677a639a764e" containerID="5492b1754fa8232d4cd55c7e05742c9f77be19b628a03f36d29f730caabe2475" exitCode=0 Nov 26 15:07:28 crc kubenswrapper[4651]: I1126 15:07:28.436633 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n9whp" 
event={"ID":"c1259668-c013-4143-b8b4-677a639a764e","Type":"ContainerDied","Data":"5492b1754fa8232d4cd55c7e05742c9f77be19b628a03f36d29f730caabe2475"} Nov 26 15:07:29 crc kubenswrapper[4651]: I1126 15:07:29.619151 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:07:29 crc kubenswrapper[4651]: I1126 15:07:29.620335 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:07:29 crc kubenswrapper[4651]: I1126 15:07:29.780900 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:07:29 crc kubenswrapper[4651]: I1126 15:07:29.781495 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:07:30 crc kubenswrapper[4651]: I1126 15:07:30.125245 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:07:30 crc kubenswrapper[4651]: I1126 15:07:30.204408 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tx4g"] Nov 26 15:07:30 crc kubenswrapper[4651]: I1126 15:07:30.204831 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" podUID="4a69dd81-a01b-4195-a42d-a07126f24904" containerName="dnsmasq-dns" containerID="cri-o://0f1f23ffb5d816450678d7df7839804d9680d95e49bf36db95c94b2b879f8dd5" gracePeriod=10 Nov 26 15:07:30 crc kubenswrapper[4651]: I1126 15:07:30.469697 4651 generic.go:334] "Generic (PLEG): container finished" podID="4a69dd81-a01b-4195-a42d-a07126f24904" containerID="0f1f23ffb5d816450678d7df7839804d9680d95e49bf36db95c94b2b879f8dd5" exitCode=0 Nov 26 15:07:30 crc kubenswrapper[4651]: I1126 15:07:30.470948 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" 
event={"ID":"4a69dd81-a01b-4195-a42d-a07126f24904","Type":"ContainerDied","Data":"0f1f23ffb5d816450678d7df7839804d9680d95e49bf36db95c94b2b879f8dd5"} Nov 26 15:07:30 crc kubenswrapper[4651]: I1126 15:07:30.760649 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" podUID="4a69dd81-a01b-4195-a42d-a07126f24904" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: connect: connection refused" Nov 26 15:07:31 crc kubenswrapper[4651]: I1126 15:07:31.484404 4651 generic.go:334] "Generic (PLEG): container finished" podID="2eca4c8f-cc45-46f6-8730-187af536d3b1" containerID="d8a8e392b5d7dad2f06ca07da9cb43f1113ca204035247a724c86fcddcc08f46" exitCode=0 Nov 26 15:07:31 crc kubenswrapper[4651]: I1126 15:07:31.484502 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-24pqd" event={"ID":"2eca4c8f-cc45-46f6-8730-187af536d3b1","Type":"ContainerDied","Data":"d8a8e392b5d7dad2f06ca07da9cb43f1113ca204035247a724c86fcddcc08f46"} Nov 26 15:07:31 crc kubenswrapper[4651]: I1126 15:07:31.990479 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-n9whp" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.073321 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgth5\" (UniqueName: \"kubernetes.io/projected/c1259668-c013-4143-b8b4-677a639a764e-kube-api-access-mgth5\") pod \"c1259668-c013-4143-b8b4-677a639a764e\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.073385 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-config-data\") pod \"c1259668-c013-4143-b8b4-677a639a764e\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.073408 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-scripts\") pod \"c1259668-c013-4143-b8b4-677a639a764e\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.073473 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-combined-ca-bundle\") pod \"c1259668-c013-4143-b8b4-677a639a764e\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.073600 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1259668-c013-4143-b8b4-677a639a764e-logs\") pod \"c1259668-c013-4143-b8b4-677a639a764e\" (UID: \"c1259668-c013-4143-b8b4-677a639a764e\") " Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.074538 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c1259668-c013-4143-b8b4-677a639a764e-logs" (OuterVolumeSpecName: "logs") pod "c1259668-c013-4143-b8b4-677a639a764e" (UID: "c1259668-c013-4143-b8b4-677a639a764e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.102706 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-scripts" (OuterVolumeSpecName: "scripts") pod "c1259668-c013-4143-b8b4-677a639a764e" (UID: "c1259668-c013-4143-b8b4-677a639a764e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.103082 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1259668-c013-4143-b8b4-677a639a764e-kube-api-access-mgth5" (OuterVolumeSpecName: "kube-api-access-mgth5") pod "c1259668-c013-4143-b8b4-677a639a764e" (UID: "c1259668-c013-4143-b8b4-677a639a764e"). InnerVolumeSpecName "kube-api-access-mgth5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.120181 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-config-data" (OuterVolumeSpecName: "config-data") pod "c1259668-c013-4143-b8b4-677a639a764e" (UID: "c1259668-c013-4143-b8b4-677a639a764e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.165115 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1259668-c013-4143-b8b4-677a639a764e" (UID: "c1259668-c013-4143-b8b4-677a639a764e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.176494 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1259668-c013-4143-b8b4-677a639a764e-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.176530 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgth5\" (UniqueName: \"kubernetes.io/projected/c1259668-c013-4143-b8b4-677a639a764e-kube-api-access-mgth5\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.176541 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.176552 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.176561 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1259668-c013-4143-b8b4-677a639a764e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.357864 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.480103 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-nb\") pod \"4a69dd81-a01b-4195-a42d-a07126f24904\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.480249 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-config\") pod \"4a69dd81-a01b-4195-a42d-a07126f24904\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.480410 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-dns-svc\") pod \"4a69dd81-a01b-4195-a42d-a07126f24904\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.480500 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-sb\") pod \"4a69dd81-a01b-4195-a42d-a07126f24904\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.480587 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrms2\" (UniqueName: \"kubernetes.io/projected/4a69dd81-a01b-4195-a42d-a07126f24904-kube-api-access-qrms2\") pod \"4a69dd81-a01b-4195-a42d-a07126f24904\" (UID: \"4a69dd81-a01b-4195-a42d-a07126f24904\") " Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.494469 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5659bd5cb9-6wbmd" 
event={"ID":"792fb64e-7839-4871-9f5e-3799da118c4d","Type":"ContainerStarted","Data":"8a0a49e6799cb8bb06047f9823616659f726040ec2755066941e8a84666e38b4"} Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.494539 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.497406 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a69dd81-a01b-4195-a42d-a07126f24904-kube-api-access-qrms2" (OuterVolumeSpecName: "kube-api-access-qrms2") pod "4a69dd81-a01b-4195-a42d-a07126f24904" (UID: "4a69dd81-a01b-4195-a42d-a07126f24904"). InnerVolumeSpecName "kube-api-access-qrms2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.499347 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" event={"ID":"4a69dd81-a01b-4195-a42d-a07126f24904","Type":"ContainerDied","Data":"6efbe1bea55f1899843cbbf67091397592da732cc4f5d3277ff1c53c35bc6fe3"} Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.499411 4651 scope.go:117] "RemoveContainer" containerID="0f1f23ffb5d816450678d7df7839804d9680d95e49bf36db95c94b2b879f8dd5" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.499362 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-9tx4g" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.504946 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-n9whp" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.507132 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n9whp" event={"ID":"c1259668-c013-4143-b8b4-677a639a764e","Type":"ContainerDied","Data":"6ca018e12c09b99da245896b883f845a320860d328b4720a5d39c0c48d7e5481"} Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.507205 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ca018e12c09b99da245896b883f845a320860d328b4720a5d39c0c48d7e5481" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.549069 4651 scope.go:117] "RemoveContainer" containerID="e6d71218cad4c1f93f13ac0dfbfca099c5fb7392a35ef5b5135f5d892aa11697" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.577348 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a69dd81-a01b-4195-a42d-a07126f24904" (UID: "4a69dd81-a01b-4195-a42d-a07126f24904"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.582961 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.583000 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrms2\" (UniqueName: \"kubernetes.io/projected/4a69dd81-a01b-4195-a42d-a07126f24904-kube-api-access-qrms2\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.601443 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a69dd81-a01b-4195-a42d-a07126f24904" (UID: "4a69dd81-a01b-4195-a42d-a07126f24904"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.604676 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a69dd81-a01b-4195-a42d-a07126f24904" (UID: "4a69dd81-a01b-4195-a42d-a07126f24904"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.621622 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-config" (OuterVolumeSpecName: "config") pod "4a69dd81-a01b-4195-a42d-a07126f24904" (UID: "4a69dd81-a01b-4195-a42d-a07126f24904"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.684504 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.684536 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.684547 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a69dd81-a01b-4195-a42d-a07126f24904-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.842063 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5659bd5cb9-6wbmd" podStartSLOduration=9.842024386 podStartE2EDuration="9.842024386s" podCreationTimestamp="2025-11-26 15:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:32.537595449 +0000 UTC m=+1019.963343063" watchObservedRunningTime="2025-11-26 15:07:32.842024386 +0000 UTC m=+1020.267772010" Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.850991 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tx4g"] Nov 26 15:07:32 crc kubenswrapper[4651]: I1126 15:07:32.858847 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-9tx4g"] Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.145541 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d7dcdb968-2bhkx"] Nov 26 15:07:33 crc kubenswrapper[4651]: E1126 15:07:33.146150 4651 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c1259668-c013-4143-b8b4-677a639a764e" containerName="placement-db-sync" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.146165 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1259668-c013-4143-b8b4-677a639a764e" containerName="placement-db-sync" Nov 26 15:07:33 crc kubenswrapper[4651]: E1126 15:07:33.146181 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a69dd81-a01b-4195-a42d-a07126f24904" containerName="dnsmasq-dns" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.146194 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a69dd81-a01b-4195-a42d-a07126f24904" containerName="dnsmasq-dns" Nov 26 15:07:33 crc kubenswrapper[4651]: E1126 15:07:33.146211 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a69dd81-a01b-4195-a42d-a07126f24904" containerName="init" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.146218 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a69dd81-a01b-4195-a42d-a07126f24904" containerName="init" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.146369 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1259668-c013-4143-b8b4-677a639a764e" containerName="placement-db-sync" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.146409 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a69dd81-a01b-4195-a42d-a07126f24904" containerName="dnsmasq-dns" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.147209 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.152574 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-t2vhh" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.152681 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.152776 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.153011 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.153194 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.155970 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.171851 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d7dcdb968-2bhkx"] Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.199714 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-credential-keys\") pod \"2eca4c8f-cc45-46f6-8730-187af536d3b1\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.199853 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-combined-ca-bundle\") pod \"2eca4c8f-cc45-46f6-8730-187af536d3b1\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.199933 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5htdd\" (UniqueName: \"kubernetes.io/projected/2eca4c8f-cc45-46f6-8730-187af536d3b1-kube-api-access-5htdd\") pod \"2eca4c8f-cc45-46f6-8730-187af536d3b1\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.199982 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-config-data\") pod \"2eca4c8f-cc45-46f6-8730-187af536d3b1\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.200009 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-scripts\") pod \"2eca4c8f-cc45-46f6-8730-187af536d3b1\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " Nov 26 
15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.200050 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-fernet-keys\") pod \"2eca4c8f-cc45-46f6-8730-187af536d3b1\" (UID: \"2eca4c8f-cc45-46f6-8730-187af536d3b1\") " Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.200249 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50553d30-1881-42f4-9e57-224db8e5be3c-logs\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.200301 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlq57\" (UniqueName: \"kubernetes.io/projected/50553d30-1881-42f4-9e57-224db8e5be3c-kube-api-access-nlq57\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.200336 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-internal-tls-certs\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.200370 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-combined-ca-bundle\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: 
I1126 15:07:33.200387 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-config-data\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.200420 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-public-tls-certs\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.200436 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-scripts\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.225410 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-scripts" (OuterVolumeSpecName: "scripts") pod "2eca4c8f-cc45-46f6-8730-187af536d3b1" (UID: "2eca4c8f-cc45-46f6-8730-187af536d3b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.229530 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eca4c8f-cc45-46f6-8730-187af536d3b1-kube-api-access-5htdd" (OuterVolumeSpecName: "kube-api-access-5htdd") pod "2eca4c8f-cc45-46f6-8730-187af536d3b1" (UID: "2eca4c8f-cc45-46f6-8730-187af536d3b1"). InnerVolumeSpecName "kube-api-access-5htdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.248176 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2eca4c8f-cc45-46f6-8730-187af536d3b1" (UID: "2eca4c8f-cc45-46f6-8730-187af536d3b1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.249720 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2eca4c8f-cc45-46f6-8730-187af536d3b1" (UID: "2eca4c8f-cc45-46f6-8730-187af536d3b1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.250767 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2eca4c8f-cc45-46f6-8730-187af536d3b1" (UID: "2eca4c8f-cc45-46f6-8730-187af536d3b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.289597 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-config-data" (OuterVolumeSpecName: "config-data") pod "2eca4c8f-cc45-46f6-8730-187af536d3b1" (UID: "2eca4c8f-cc45-46f6-8730-187af536d3b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.305449 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-combined-ca-bundle\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.305532 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-config-data\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.305635 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-public-tls-certs\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.305663 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-scripts\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.305866 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50553d30-1881-42f4-9e57-224db8e5be3c-logs\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 
15:07:33.305961 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlq57\" (UniqueName: \"kubernetes.io/projected/50553d30-1881-42f4-9e57-224db8e5be3c-kube-api-access-nlq57\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.306063 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-internal-tls-certs\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.306485 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5htdd\" (UniqueName: \"kubernetes.io/projected/2eca4c8f-cc45-46f6-8730-187af536d3b1-kube-api-access-5htdd\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.306503 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.306515 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.306525 4651 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.306536 4651 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.306552 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eca4c8f-cc45-46f6-8730-187af536d3b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.310778 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50553d30-1881-42f4-9e57-224db8e5be3c-logs\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.311122 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-scripts\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.312923 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-internal-tls-certs\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.315241 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-combined-ca-bundle\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.325691 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-public-tls-certs\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.332008 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlq57\" (UniqueName: \"kubernetes.io/projected/50553d30-1881-42f4-9e57-224db8e5be3c-kube-api-access-nlq57\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.335807 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50553d30-1881-42f4-9e57-224db8e5be3c-config-data\") pod \"placement-6d7dcdb968-2bhkx\" (UID: \"50553d30-1881-42f4-9e57-224db8e5be3c\") " pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.425858 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a69dd81-a01b-4195-a42d-a07126f24904" path="/var/lib/kubelet/pods/4a69dd81-a01b-4195-a42d-a07126f24904/volumes" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.475920 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-t2vhh" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.484307 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.518715 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-p6s6f" event={"ID":"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b","Type":"ContainerStarted","Data":"6d1911b8e56de816e038b6661a78c7991b66e0a55e28c0d04c6a75bd20e26e83"} Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.544852 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34a40fec-099f-437f-b32a-2b81bf3b32f8","Type":"ContainerStarted","Data":"3acf0890d832aef9e73def3a879e5bd074b7a482d3f77ce9eaebc8d3c9e6db46"} Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.545861 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-p6s6f" podStartSLOduration=3.689728493 podStartE2EDuration="53.545844554s" podCreationTimestamp="2025-11-26 15:06:40 +0000 UTC" firstStartedPulling="2025-11-26 15:06:43.135059455 +0000 UTC m=+970.560807059" lastFinishedPulling="2025-11-26 15:07:32.991175526 +0000 UTC m=+1020.416923120" observedRunningTime="2025-11-26 15:07:33.537368772 +0000 UTC m=+1020.963116376" watchObservedRunningTime="2025-11-26 15:07:33.545844554 +0000 UTC m=+1020.971592158" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.547322 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-24pqd" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.548101 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-24pqd" event={"ID":"2eca4c8f-cc45-46f6-8730-187af536d3b1","Type":"ContainerDied","Data":"e29f77cff424184b7d4443740d5ecda635dd26b2d81f6b4ff499c3142cda016a"} Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.548125 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e29f77cff424184b7d4443740d5ecda635dd26b2d81f6b4ff499c3142cda016a" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.634792 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7d45c4597-qv4b7"] Nov 26 15:07:33 crc kubenswrapper[4651]: E1126 15:07:33.635149 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eca4c8f-cc45-46f6-8730-187af536d3b1" containerName="keystone-bootstrap" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.635162 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eca4c8f-cc45-46f6-8730-187af536d3b1" containerName="keystone-bootstrap" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.635342 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eca4c8f-cc45-46f6-8730-187af536d3b1" containerName="keystone-bootstrap" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.635875 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.646817 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.647049 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.647201 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.647820 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.647967 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.648264 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k89xc" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.655857 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d45c4597-qv4b7"] Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.718415 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-public-tls-certs\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.718475 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-combined-ca-bundle\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " 
pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.718511 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-internal-tls-certs\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.718538 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-credential-keys\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.718569 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-scripts\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.718678 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gvbm\" (UniqueName: \"kubernetes.io/projected/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-kube-api-access-9gvbm\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.718705 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-fernet-keys\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " 
pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.718788 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-config-data\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.820775 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gvbm\" (UniqueName: \"kubernetes.io/projected/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-kube-api-access-9gvbm\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.820823 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-fernet-keys\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.820890 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-config-data\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.820933 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-public-tls-certs\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 
15:07:33.820958 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-combined-ca-bundle\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.820984 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-internal-tls-certs\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.821008 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-credential-keys\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.821048 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-scripts\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.827617 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-public-tls-certs\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.827798 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-fernet-keys\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.831858 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-credential-keys\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.833570 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-scripts\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.833612 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-combined-ca-bundle\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.834755 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-internal-tls-certs\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.836347 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-config-data\") pod \"keystone-7d45c4597-qv4b7\" (UID: 
\"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.845854 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gvbm\" (UniqueName: \"kubernetes.io/projected/1c55f267-1433-4e39-9f2e-a2a8e038dbb5-kube-api-access-9gvbm\") pod \"keystone-7d45c4597-qv4b7\" (UID: \"1c55f267-1433-4e39-9f2e-a2a8e038dbb5\") " pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:33 crc kubenswrapper[4651]: I1126 15:07:33.994489 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:34 crc kubenswrapper[4651]: I1126 15:07:34.104408 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d7dcdb968-2bhkx"] Nov 26 15:07:34 crc kubenswrapper[4651]: I1126 15:07:34.178466 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:34 crc kubenswrapper[4651]: I1126 15:07:34.179839 4651 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 15:07:34 crc kubenswrapper[4651]: I1126 15:07:34.184016 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 15:07:34 crc kubenswrapper[4651]: I1126 15:07:34.235560 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 15:07:34 crc kubenswrapper[4651]: I1126 15:07:34.235677 4651 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 15:07:34 crc kubenswrapper[4651]: I1126 15:07:34.238469 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 15:07:34 crc kubenswrapper[4651]: I1126 15:07:34.625476 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wzxcr" 
event={"ID":"0b39efce-2985-4f46-91a2-bb397f605c9c","Type":"ContainerStarted","Data":"24b0058da5b36097879a9dd3bfbd2e2aa5d0acde3fd564408286e8951f80181e"} Nov 26 15:07:34 crc kubenswrapper[4651]: I1126 15:07:34.649487 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d7dcdb968-2bhkx" event={"ID":"50553d30-1881-42f4-9e57-224db8e5be3c","Type":"ContainerStarted","Data":"39b0af6d228a1932fe6e7fe3d5bf7287d2418430a1c631de667f99f4b20ed078"} Nov 26 15:07:34 crc kubenswrapper[4651]: I1126 15:07:34.683250 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-wzxcr" podStartSLOduration=4.892861537 podStartE2EDuration="55.6832296s" podCreationTimestamp="2025-11-26 15:06:39 +0000 UTC" firstStartedPulling="2025-11-26 15:06:42.130167085 +0000 UTC m=+969.555914699" lastFinishedPulling="2025-11-26 15:07:32.920535158 +0000 UTC m=+1020.346282762" observedRunningTime="2025-11-26 15:07:34.653684749 +0000 UTC m=+1022.079432353" watchObservedRunningTime="2025-11-26 15:07:34.6832296 +0000 UTC m=+1022.108977204" Nov 26 15:07:34 crc kubenswrapper[4651]: I1126 15:07:34.788616 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d45c4597-qv4b7"] Nov 26 15:07:34 crc kubenswrapper[4651]: W1126 15:07:34.799164 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c55f267_1433_4e39_9f2e_a2a8e038dbb5.slice/crio-df46c937fe8e422e1b1f4d4574c689c47c588a4a9b77dd7120717619a41b7e20 WatchSource:0}: Error finding container df46c937fe8e422e1b1f4d4574c689c47c588a4a9b77dd7120717619a41b7e20: Status 404 returned error can't find the container with id df46c937fe8e422e1b1f4d4574c689c47c588a4a9b77dd7120717619a41b7e20 Nov 26 15:07:35 crc kubenswrapper[4651]: I1126 15:07:35.659287 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d45c4597-qv4b7" 
event={"ID":"1c55f267-1433-4e39-9f2e-a2a8e038dbb5","Type":"ContainerStarted","Data":"83873cc7e19eb39e0deb9e7a0de2f39e27ac37dba3a2bb36a7365073bc535fb9"} Nov 26 15:07:35 crc kubenswrapper[4651]: I1126 15:07:35.659833 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d45c4597-qv4b7" event={"ID":"1c55f267-1433-4e39-9f2e-a2a8e038dbb5","Type":"ContainerStarted","Data":"df46c937fe8e422e1b1f4d4574c689c47c588a4a9b77dd7120717619a41b7e20"} Nov 26 15:07:35 crc kubenswrapper[4651]: I1126 15:07:35.659850 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:07:35 crc kubenswrapper[4651]: I1126 15:07:35.663759 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d7dcdb968-2bhkx" event={"ID":"50553d30-1881-42f4-9e57-224db8e5be3c","Type":"ContainerStarted","Data":"5bcf6e868bb4f50f6899c8940b61e46ba87ab5b9d06bdb015f2d7567b7214266"} Nov 26 15:07:35 crc kubenswrapper[4651]: I1126 15:07:35.663801 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d7dcdb968-2bhkx" event={"ID":"50553d30-1881-42f4-9e57-224db8e5be3c","Type":"ContainerStarted","Data":"be7bbcd37407065d1c17b881029f9f4d926c35b002088afc430d736c3e928680"} Nov 26 15:07:35 crc kubenswrapper[4651]: I1126 15:07:35.664200 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:35 crc kubenswrapper[4651]: I1126 15:07:35.664261 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:07:35 crc kubenswrapper[4651]: I1126 15:07:35.684665 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7d45c4597-qv4b7" podStartSLOduration=2.684647796 podStartE2EDuration="2.684647796s" podCreationTimestamp="2025-11-26 15:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:35.679012802 +0000 UTC m=+1023.104760406" watchObservedRunningTime="2025-11-26 15:07:35.684647796 +0000 UTC m=+1023.110395400" Nov 26 15:07:35 crc kubenswrapper[4651]: I1126 15:07:35.705441 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d7dcdb968-2bhkx" podStartSLOduration=2.705421056 podStartE2EDuration="2.705421056s" podCreationTimestamp="2025-11-26 15:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:35.702230968 +0000 UTC m=+1023.127978582" watchObservedRunningTime="2025-11-26 15:07:35.705421056 +0000 UTC m=+1023.131168670" Nov 26 15:07:39 crc kubenswrapper[4651]: I1126 15:07:39.612873 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 26 15:07:39 crc kubenswrapper[4651]: I1126 15:07:39.782341 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f54c7c77d-rx8gm" podUID="5c09de21-84b0-440d-b34c-3054ec6741fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Nov 26 15:07:40 crc kubenswrapper[4651]: I1126 15:07:40.708498 4651 generic.go:334] "Generic (PLEG): container finished" podID="81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b" containerID="6d1911b8e56de816e038b6661a78c7991b66e0a55e28c0d04c6a75bd20e26e83" exitCode=0 Nov 26 15:07:40 crc kubenswrapper[4651]: I1126 15:07:40.708550 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-p6s6f" 
event={"ID":"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b","Type":"ContainerDied","Data":"6d1911b8e56de816e038b6661a78c7991b66e0a55e28c0d04c6a75bd20e26e83"} Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.353059 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.417247 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-combined-ca-bundle\") pod \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.417413 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkhsp\" (UniqueName: \"kubernetes.io/projected/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-kube-api-access-tkhsp\") pod \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.417576 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-db-sync-config-data\") pod \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\" (UID: \"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b\") " Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.423546 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-kube-api-access-tkhsp" (OuterVolumeSpecName: "kube-api-access-tkhsp") pod "81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b" (UID: "81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b"). InnerVolumeSpecName "kube-api-access-tkhsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:43 crc kubenswrapper[4651]: E1126 15:07:43.432568 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openstack/swift-storage-0" podUID="a3b8c2db-ce7f-48ce-9fd1-d55b5583773e" Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.446004 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b" (UID: "81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.449557 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b" (UID: "81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.519493 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkhsp\" (UniqueName: \"kubernetes.io/projected/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-kube-api-access-tkhsp\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.520390 4651 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.520410 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.738766 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-p6s6f" event={"ID":"81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b","Type":"ContainerDied","Data":"65cd413cfccece2e5db5616f1e32b1f1d2d0e01d5fa1548f4c51f4d7e31d59c2"} Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.738812 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65cd413cfccece2e5db5616f1e32b1f1d2d0e01d5fa1548f4c51f4d7e31d59c2" Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.738812 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-p6s6f" Nov 26 15:07:43 crc kubenswrapper[4651]: I1126 15:07:43.738836 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 26 15:07:44 crc kubenswrapper[4651]: E1126 15:07:44.557113 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.642769 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7955b99f58-492wx"] Nov 26 15:07:44 crc kubenswrapper[4651]: E1126 15:07:44.643564 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b" containerName="barbican-db-sync" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.643584 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b" containerName="barbican-db-sync" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.652303 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b" containerName="barbican-db-sync" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.653587 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.658648 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-848fcb696c-vfdpx"] Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.660551 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.660913 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.661100 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pv895" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.666159 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.679013 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.694888 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7955b99f58-492wx"] Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.715590 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-848fcb696c-vfdpx"] Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.748002 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e365e4-f902-439d-92e3-de43fd6ccdaf-combined-ca-bundle\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.748082 4651 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7e365e4-f902-439d-92e3-de43fd6ccdaf-config-data-custom\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.748250 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st6gr\" (UniqueName: \"kubernetes.io/projected/c7e365e4-f902-439d-92e3-de43fd6ccdaf-kube-api-access-st6gr\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.748341 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf420acf-d3d6-45d2-a484-66265c5a1bcd-logs\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.748413 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf420acf-d3d6-45d2-a484-66265c5a1bcd-config-data\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.748449 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e365e4-f902-439d-92e3-de43fd6ccdaf-logs\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " 
pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.748758 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf420acf-d3d6-45d2-a484-66265c5a1bcd-combined-ca-bundle\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.748801 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf420acf-d3d6-45d2-a484-66265c5a1bcd-config-data-custom\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.748819 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhxqn\" (UniqueName: \"kubernetes.io/projected/bf420acf-d3d6-45d2-a484-66265c5a1bcd-kube-api-access-rhxqn\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.748843 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e365e4-f902-439d-92e3-de43fd6ccdaf-config-data\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.754989 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"34a40fec-099f-437f-b32a-2b81bf3b32f8","Type":"ContainerStarted","Data":"d7231b8eea9cc66fdc7fad09bf4894ff40cedc164f8de53c60eb7bef61f58223"} Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.755173 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="ceilometer-notification-agent" containerID="cri-o://982ad101dcca888166edea02c1d706d541f1e1b6586983041bc2e91bcdd03cc4" gracePeriod=30 Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.755258 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.755370 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="sg-core" containerID="cri-o://3acf0890d832aef9e73def3a879e5bd074b7a482d3f77ce9eaebc8d3c9e6db46" gracePeriod=30 Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.755437 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="proxy-httpd" containerID="cri-o://d7231b8eea9cc66fdc7fad09bf4894ff40cedc164f8de53c60eb7bef61f58223" gracePeriod=30 Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.818433 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-pqjjg"] Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.819739 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850318 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850379 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf420acf-d3d6-45d2-a484-66265c5a1bcd-combined-ca-bundle\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850413 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-config\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850439 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf420acf-d3d6-45d2-a484-66265c5a1bcd-config-data-custom\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850460 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxqn\" (UniqueName: \"kubernetes.io/projected/bf420acf-d3d6-45d2-a484-66265c5a1bcd-kube-api-access-rhxqn\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: 
\"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850482 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e365e4-f902-439d-92e3-de43fd6ccdaf-config-data\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850525 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850582 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxplq\" (UniqueName: \"kubernetes.io/projected/4953a14e-2b1f-4cdb-b5c3-92edede693f1-kube-api-access-qxplq\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850607 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e365e4-f902-439d-92e3-de43fd6ccdaf-combined-ca-bundle\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850644 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850670 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7e365e4-f902-439d-92e3-de43fd6ccdaf-config-data-custom\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850693 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st6gr\" (UniqueName: \"kubernetes.io/projected/c7e365e4-f902-439d-92e3-de43fd6ccdaf-kube-api-access-st6gr\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850720 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf420acf-d3d6-45d2-a484-66265c5a1bcd-logs\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850742 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf420acf-d3d6-45d2-a484-66265c5a1bcd-config-data\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.850764 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c7e365e4-f902-439d-92e3-de43fd6ccdaf-logs\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.851211 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e365e4-f902-439d-92e3-de43fd6ccdaf-logs\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.851761 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf420acf-d3d6-45d2-a484-66265c5a1bcd-logs\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.861024 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e365e4-f902-439d-92e3-de43fd6ccdaf-combined-ca-bundle\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.861398 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7e365e4-f902-439d-92e3-de43fd6ccdaf-config-data-custom\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.861709 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bf420acf-d3d6-45d2-a484-66265c5a1bcd-config-data\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.862071 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf420acf-d3d6-45d2-a484-66265c5a1bcd-combined-ca-bundle\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.862074 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf420acf-d3d6-45d2-a484-66265c5a1bcd-config-data-custom\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.865396 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e365e4-f902-439d-92e3-de43fd6ccdaf-config-data\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.869854 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st6gr\" (UniqueName: \"kubernetes.io/projected/c7e365e4-f902-439d-92e3-de43fd6ccdaf-kube-api-access-st6gr\") pod \"barbican-keystone-listener-7955b99f58-492wx\" (UID: \"c7e365e4-f902-439d-92e3-de43fd6ccdaf\") " pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.911619 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-pqjjg"] Nov 
26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.926455 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhxqn\" (UniqueName: \"kubernetes.io/projected/bf420acf-d3d6-45d2-a484-66265c5a1bcd-kube-api-access-rhxqn\") pod \"barbican-worker-848fcb696c-vfdpx\" (UID: \"bf420acf-d3d6-45d2-a484-66265c5a1bcd\") " pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.951851 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.951934 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxplq\" (UniqueName: \"kubernetes.io/projected/4953a14e-2b1f-4cdb-b5c3-92edede693f1-kube-api-access-qxplq\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.951964 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.952045 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: 
I1126 15:07:44.952078 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-config\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.952852 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-config\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.953391 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.954114 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.954641 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.986110 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d8f64784b-d4hq5"] Nov 26 
15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.987542 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:44 crc kubenswrapper[4651]: I1126 15:07:44.991455 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:44.999529 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7955b99f58-492wx" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.002637 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-848fcb696c-vfdpx" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.004232 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d8f64784b-d4hq5"] Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.007250 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxplq\" (UniqueName: \"kubernetes.io/projected/4953a14e-2b1f-4cdb-b5c3-92edede693f1-kube-api-access-qxplq\") pod \"dnsmasq-dns-7d649d8c65-pqjjg\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.053150 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data-custom\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.053204 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a68a3631-5c70-4319-844d-4c015bd0fe32-logs\") pod \"barbican-api-d8f64784b-d4hq5\" 
(UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.053301 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jr5d\" (UniqueName: \"kubernetes.io/projected/a68a3631-5c70-4319-844d-4c015bd0fe32-kube-api-access-2jr5d\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.055978 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.056075 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-combined-ca-bundle\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.149985 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.157661 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jr5d\" (UniqueName: \"kubernetes.io/projected/a68a3631-5c70-4319-844d-4c015bd0fe32-kube-api-access-2jr5d\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.158030 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.158104 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-combined-ca-bundle\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.158182 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data-custom\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.158212 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a68a3631-5c70-4319-844d-4c015bd0fe32-logs\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " 
pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.158748 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a68a3631-5c70-4319-844d-4c015bd0fe32-logs\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.164422 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data-custom\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.164981 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.165751 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-combined-ca-bundle\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.179742 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jr5d\" (UniqueName: \"kubernetes.io/projected/a68a3631-5c70-4319-844d-4c015bd0fe32-kube-api-access-2jr5d\") pod \"barbican-api-d8f64784b-d4hq5\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 
15:07:45.329783 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d8f64784b-d4hq5"
Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.637954 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7955b99f58-492wx"]
Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.748361 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-848fcb696c-vfdpx"]
Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.803280 4651 generic.go:334] "Generic (PLEG): container finished" podID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerID="3acf0890d832aef9e73def3a879e5bd074b7a482d3f77ce9eaebc8d3c9e6db46" exitCode=2
Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.803387 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34a40fec-099f-437f-b32a-2b81bf3b32f8","Type":"ContainerDied","Data":"3acf0890d832aef9e73def3a879e5bd074b7a482d3f77ce9eaebc8d3c9e6db46"}
Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.812118 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7955b99f58-492wx" event={"ID":"c7e365e4-f902-439d-92e3-de43fd6ccdaf","Type":"ContainerStarted","Data":"ec6d7e33972b1d454cf9381b4aa856668f65936479682e4e3a6d892d463dc8e3"}
Nov 26 15:07:45 crc kubenswrapper[4651]: I1126 15:07:45.887237 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-pqjjg"]
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.111409 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d8f64784b-d4hq5"]
Nov 26 15:07:46 crc kubenswrapper[4651]: W1126 15:07:46.144127 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda68a3631_5c70_4319_844d_4c015bd0fe32.slice/crio-f0370053f9f04178b60b3f40b099790894ebeecdd7f5232354bc1eb4ca8dad14 WatchSource:0}: Error finding container f0370053f9f04178b60b3f40b099790894ebeecdd7f5232354bc1eb4ca8dad14: Status 404 returned error can't find the container with id f0370053f9f04178b60b3f40b099790894ebeecdd7f5232354bc1eb4ca8dad14
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.867902 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d8f64784b-d4hq5" event={"ID":"a68a3631-5c70-4319-844d-4c015bd0fe32","Type":"ContainerStarted","Data":"5efd66eb0c6328ebde61af93210afab889e30881a6820800c8e04b92b0db3099"}
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.868283 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d8f64784b-d4hq5" event={"ID":"a68a3631-5c70-4319-844d-4c015bd0fe32","Type":"ContainerStarted","Data":"ff10f07f66032ee801ad18eea83cde81a45d9b921c8a3d1057163322eaa055a6"}
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.868292 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d8f64784b-d4hq5" event={"ID":"a68a3631-5c70-4319-844d-4c015bd0fe32","Type":"ContainerStarted","Data":"f0370053f9f04178b60b3f40b099790894ebeecdd7f5232354bc1eb4ca8dad14"}
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.868636 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d8f64784b-d4hq5"
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.868655 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d8f64784b-d4hq5"
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.882973 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-848fcb696c-vfdpx" event={"ID":"bf420acf-d3d6-45d2-a484-66265c5a1bcd","Type":"ContainerStarted","Data":"37fe36e3f7f573f50d486a047b976ca302b024e3d04fe697951ced6c4aeead76"}
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.887368 4651 generic.go:334] "Generic (PLEG): container finished" podID="4953a14e-2b1f-4cdb-b5c3-92edede693f1" containerID="07bb429a3ec0af424619c9fa6946a0f2ce1766191bfedf24ff109bfff5814d19" exitCode=0
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.887485 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" event={"ID":"4953a14e-2b1f-4cdb-b5c3-92edede693f1","Type":"ContainerDied","Data":"07bb429a3ec0af424619c9fa6946a0f2ce1766191bfedf24ff109bfff5814d19"}
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.887516 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" event={"ID":"4953a14e-2b1f-4cdb-b5c3-92edede693f1","Type":"ContainerStarted","Data":"4f192f382025bcdbac7a58ac98ceeccdbe3995bd451a7c941f970f334e6c615d"}
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.900637 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d8f64784b-d4hq5" podStartSLOduration=2.900619333 podStartE2EDuration="2.900619333s" podCreationTimestamp="2025-11-26 15:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:46.890454694 +0000 UTC m=+1034.316202328" watchObservedRunningTime="2025-11-26 15:07:46.900619333 +0000 UTC m=+1034.326366937"
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.904913 4651 generic.go:334] "Generic (PLEG): container finished" podID="0b39efce-2985-4f46-91a2-bb397f605c9c" containerID="24b0058da5b36097879a9dd3bfbd2e2aa5d0acde3fd564408286e8951f80181e" exitCode=0
Nov 26 15:07:46 crc kubenswrapper[4651]: I1126 15:07:46.904958 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wzxcr" event={"ID":"0b39efce-2985-4f46-91a2-bb397f605c9c","Type":"ContainerDied","Data":"24b0058da5b36097879a9dd3bfbd2e2aa5d0acde3fd564408286e8951f80181e"}
Nov 26 15:07:47 crc kubenswrapper[4651]: I1126 15:07:47.934621 4651 generic.go:334] "Generic (PLEG): container finished" podID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerID="982ad101dcca888166edea02c1d706d541f1e1b6586983041bc2e91bcdd03cc4" exitCode=0
Nov 26 15:07:47 crc kubenswrapper[4651]: I1126 15:07:47.934813 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34a40fec-099f-437f-b32a-2b81bf3b32f8","Type":"ContainerDied","Data":"982ad101dcca888166edea02c1d706d541f1e1b6586983041bc2e91bcdd03cc4"}
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.243857 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0"
Nov 26 15:07:48 crc kubenswrapper[4651]: E1126 15:07:48.244257 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Nov 26 15:07:48 crc kubenswrapper[4651]: E1126 15:07:48.244279 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Nov 26 15:07:48 crc kubenswrapper[4651]: E1126 15:07:48.244353 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift podName:a3b8c2db-ce7f-48ce-9fd1-d55b5583773e nodeName:}" failed. No retries permitted until 2025-11-26 15:09:50.244334005 +0000 UTC m=+1157.670081609 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift") pod "swift-storage-0" (UID: "a3b8c2db-ce7f-48ce-9fd1-d55b5583773e") : configmap "swift-ring-files" not found
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.325901 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5688f744d6-ck9mn"]
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.327857 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.332972 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.333401 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.339831 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5688f744d6-ck9mn"]
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.447104 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-public-tls-certs\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.447406 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38335944-e310-47d9-b2c1-c6f931134e10-logs\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.447466 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-internal-tls-certs\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.447553 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnflp\" (UniqueName: \"kubernetes.io/projected/38335944-e310-47d9-b2c1-c6f931134e10-kube-api-access-gnflp\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.447691 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-config-data-custom\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.447726 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-combined-ca-bundle\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.447849 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-config-data\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.549681 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-config-data\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.549749 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-public-tls-certs\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.549852 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38335944-e310-47d9-b2c1-c6f931134e10-logs\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.549874 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-internal-tls-certs\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.549901 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnflp\" (UniqueName: \"kubernetes.io/projected/38335944-e310-47d9-b2c1-c6f931134e10-kube-api-access-gnflp\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.549943 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-config-data-custom\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.549957 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-combined-ca-bundle\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.550632 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38335944-e310-47d9-b2c1-c6f931134e10-logs\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.556417 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-internal-tls-certs\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.558174 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-combined-ca-bundle\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.559620 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-config-data-custom\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.571380 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-public-tls-certs\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.575177 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38335944-e310-47d9-b2c1-c6f931134e10-config-data\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.577746 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnflp\" (UniqueName: \"kubernetes.io/projected/38335944-e310-47d9-b2c1-c6f931134e10-kube-api-access-gnflp\") pod \"barbican-api-5688f744d6-ck9mn\" (UID: \"38335944-e310-47d9-b2c1-c6f931134e10\") " pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.661302 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5688f744d6-ck9mn"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.767130 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.854572 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-scripts\") pod \"0b39efce-2985-4f46-91a2-bb397f605c9c\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") "
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.854627 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-config-data\") pod \"0b39efce-2985-4f46-91a2-bb397f605c9c\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") "
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.854752 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sff7j\" (UniqueName: \"kubernetes.io/projected/0b39efce-2985-4f46-91a2-bb397f605c9c-kube-api-access-sff7j\") pod \"0b39efce-2985-4f46-91a2-bb397f605c9c\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") "
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.854805 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-db-sync-config-data\") pod \"0b39efce-2985-4f46-91a2-bb397f605c9c\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") "
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.854852 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b39efce-2985-4f46-91a2-bb397f605c9c-etc-machine-id\") pod \"0b39efce-2985-4f46-91a2-bb397f605c9c\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") "
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.854867 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-combined-ca-bundle\") pod \"0b39efce-2985-4f46-91a2-bb397f605c9c\" (UID: \"0b39efce-2985-4f46-91a2-bb397f605c9c\") "
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.861110 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b39efce-2985-4f46-91a2-bb397f605c9c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0b39efce-2985-4f46-91a2-bb397f605c9c" (UID: "0b39efce-2985-4f46-91a2-bb397f605c9c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.867185 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-scripts" (OuterVolumeSpecName: "scripts") pod "0b39efce-2985-4f46-91a2-bb397f605c9c" (UID: "0b39efce-2985-4f46-91a2-bb397f605c9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.890302 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b39efce-2985-4f46-91a2-bb397f605c9c-kube-api-access-sff7j" (OuterVolumeSpecName: "kube-api-access-sff7j") pod "0b39efce-2985-4f46-91a2-bb397f605c9c" (UID: "0b39efce-2985-4f46-91a2-bb397f605c9c"). InnerVolumeSpecName "kube-api-access-sff7j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.896553 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0b39efce-2985-4f46-91a2-bb397f605c9c" (UID: "0b39efce-2985-4f46-91a2-bb397f605c9c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.987582 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.987619 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sff7j\" (UniqueName: \"kubernetes.io/projected/0b39efce-2985-4f46-91a2-bb397f605c9c-kube-api-access-sff7j\") on node \"crc\" DevicePath \"\""
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.987634 4651 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 15:07:48 crc kubenswrapper[4651]: I1126 15:07:48.987647 4651 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b39efce-2985-4f46-91a2-bb397f605c9c-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.084241 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b39efce-2985-4f46-91a2-bb397f605c9c" (UID: "0b39efce-2985-4f46-91a2-bb397f605c9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.093413 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.240640 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" event={"ID":"4953a14e-2b1f-4cdb-b5c3-92edede693f1","Type":"ContainerStarted","Data":"291b200c9699be2b0912666e63aa17a5b9d7b53776d26f77c55c038bed9e707c"}
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.242297 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.267312 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-config-data" (OuterVolumeSpecName: "config-data") pod "0b39efce-2985-4f46-91a2-bb397f605c9c" (UID: "0b39efce-2985-4f46-91a2-bb397f605c9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.297229 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b39efce-2985-4f46-91a2-bb397f605c9c-config-data\") on node \"crc\" DevicePath \"\""
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.315481 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wzxcr" event={"ID":"0b39efce-2985-4f46-91a2-bb397f605c9c","Type":"ContainerDied","Data":"279b8117b34cd68b30ec015e5a18d3e2e9bfe2da47c31c7b5439909a17b14e13"}
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.315531 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="279b8117b34cd68b30ec015e5a18d3e2e9bfe2da47c31c7b5439909a17b14e13"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.315628 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wzxcr"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.344381 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" podStartSLOduration=5.344366236 podStartE2EDuration="5.344366236s" podCreationTimestamp="2025-11-26 15:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:49.344208752 +0000 UTC m=+1036.769956356" watchObservedRunningTime="2025-11-26 15:07:49.344366236 +0000 UTC m=+1036.770113840"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.565276 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5688f744d6-ck9mn"]
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.616683 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.618694 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-pqjjg"]
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.717108 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 26 15:07:49 crc kubenswrapper[4651]: E1126 15:07:49.717861 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b39efce-2985-4f46-91a2-bb397f605c9c" containerName="cinder-db-sync"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.717876 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b39efce-2985-4f46-91a2-bb397f605c9c" containerName="cinder-db-sync"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.718233 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b39efce-2985-4f46-91a2-bb397f605c9c" containerName="cinder-db-sync"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.739753 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.745364 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.745557 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-q7r49"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.746919 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.751820 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57fff66767-89s67"]
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.753306 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.754421 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57fff66767-89s67"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.828306 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f54c7c77d-rx8gm" podUID="5c09de21-84b0-440d-b34c-3054ec6741fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.837816 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57fff66767-89s67"]
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.838759 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78x5l\" (UniqueName: \"kubernetes.io/projected/b1b53de3-9040-402f-af0f-3370cffae66f-kube-api-access-78x5l\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.838804 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-nb\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.838827 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.838872 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-sb\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.838980 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntppr\" (UniqueName: \"kubernetes.io/projected/615ee6d9-0216-4f0a-b9ea-579fc268806e-kube-api-access-ntppr\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.839067 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-config\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.839169 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-dns-svc\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.839234 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1b53de3-9040-402f-af0f-3370cffae66f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.839265 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.839303 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.839465 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.864886 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 26 15:07:49 crc kubenswrapper[4651]: E1126 15:07:49.877899 4651 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b39efce_2985_4f46_91a2_bb397f605c9c.slice\": RecentStats: unable to find data in memory cache]"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.884113 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.885757 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.887915 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.908369 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.940923 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-scripts\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.940987 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78x5l\" (UniqueName: \"kubernetes.io/projected/b1b53de3-9040-402f-af0f-3370cffae66f-kube-api-access-78x5l\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941027 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-nb\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941066 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941088 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnzr2\" (UniqueName: \"kubernetes.io/projected/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-kube-api-access-bnzr2\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941143 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-sb\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941218 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941242 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntppr\" (UniqueName: \"kubernetes.io/projected/615ee6d9-0216-4f0a-b9ea-579fc268806e-kube-api-access-ntppr\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941261 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-config\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67"
Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941317 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\"
(UniqueName: \"kubernetes.io/empty-dir/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-logs\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941335 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-dns-svc\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941384 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941402 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1b53de3-9040-402f-af0f-3370cffae66f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941421 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941461 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941481 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941501 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data-custom\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.941537 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.943420 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-config\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.944477 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-nb\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.950103 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-dns-svc\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.950450 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1b53de3-9040-402f-af0f-3370cffae66f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.954188 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-sb\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.966733 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.967504 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78x5l\" (UniqueName: \"kubernetes.io/projected/b1b53de3-9040-402f-af0f-3370cffae66f-kube-api-access-78x5l\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.967694 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.983785 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntppr\" (UniqueName: \"kubernetes.io/projected/615ee6d9-0216-4f0a-b9ea-579fc268806e-kube-api-access-ntppr\") pod \"dnsmasq-dns-57fff66767-89s67\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " pod="openstack/dnsmasq-dns-57fff66767-89s67" Nov 26 15:07:49 crc kubenswrapper[4651]: I1126 15:07:49.995690 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:49.999866 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " pod="openstack/cinder-scheduler-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.043361 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-scripts\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.043752 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnzr2\" (UniqueName: \"kubernetes.io/projected/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-kube-api-access-bnzr2\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " 
pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.043838 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.043884 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-logs\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.043928 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.043965 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.043999 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data-custom\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.047548 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-logs\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.047548 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.051300 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-scripts\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.051326 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.051800 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.052409 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data-custom\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.073143 4651 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bnzr2\" (UniqueName: \"kubernetes.io/projected/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-kube-api-access-bnzr2\") pod \"cinder-api-0\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.168028 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.170909 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.219297 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57fff66767-89s67" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.240493 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.368345 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5688f744d6-ck9mn" event={"ID":"38335944-e310-47d9-b2c1-c6f931134e10","Type":"ContainerStarted","Data":"c6929cad5ae9b4f4578c97dc2a1e233ed8ac154939680d366f1e2a15eb9116c3"} Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.368621 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5688f744d6-ck9mn" event={"ID":"38335944-e310-47d9-b2c1-c6f931134e10","Type":"ContainerStarted","Data":"e36deec0d66432918ed8da14a8295721c33743384bec3bfeae74ac4158d2f73e"} Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.399655 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7955b99f58-492wx" event={"ID":"c7e365e4-f902-439d-92e3-de43fd6ccdaf","Type":"ContainerStarted","Data":"f08172c03b54fb281e583b95b48da6442de71d4f4fb22c77d8d32b63591f96d7"} Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.423183 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-848fcb696c-vfdpx" event={"ID":"bf420acf-d3d6-45d2-a484-66265c5a1bcd","Type":"ContainerStarted","Data":"e294ffbdda17fd953a81224a257900953942133fe570943cc4379119fbd667d5"} Nov 26 15:07:50 crc kubenswrapper[4651]: I1126 15:07:50.828597 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.081552 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.101162 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57fff66767-89s67"] Nov 26 15:07:51 crc kubenswrapper[4651]: W1126 15:07:51.112118 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod615ee6d9_0216_4f0a_b9ea_579fc268806e.slice/crio-135a6a74852445d41dad7330fcd50167b9617fae108173f96b238547fd3d2a37 WatchSource:0}: Error finding container 135a6a74852445d41dad7330fcd50167b9617fae108173f96b238547fd3d2a37: Status 404 returned error can't find the container with id 135a6a74852445d41dad7330fcd50167b9617fae108173f96b238547fd3d2a37 Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.484983 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5688f744d6-ck9mn" event={"ID":"38335944-e310-47d9-b2c1-c6f931134e10","Type":"ContainerStarted","Data":"09f037c4b6f154e2f7ea500712ec8b99f025028208377691899514c64aaad0e2"} Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.486627 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5688f744d6-ck9mn" Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.486668 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5688f744d6-ck9mn" Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.491695 4651 
generic.go:334] "Generic (PLEG): container finished" podID="615ee6d9-0216-4f0a-b9ea-579fc268806e" containerID="1dddc5099281e828e58cb406e12de033cb05095a3ad464954ef35f6aeca334b9" exitCode=0 Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.491757 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fff66767-89s67" event={"ID":"615ee6d9-0216-4f0a-b9ea-579fc268806e","Type":"ContainerDied","Data":"1dddc5099281e828e58cb406e12de033cb05095a3ad464954ef35f6aeca334b9"} Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.491782 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fff66767-89s67" event={"ID":"615ee6d9-0216-4f0a-b9ea-579fc268806e","Type":"ContainerStarted","Data":"135a6a74852445d41dad7330fcd50167b9617fae108173f96b238547fd3d2a37"} Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.526468 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cd3eea7b-b827-4648-a65f-9f8508f0f6c2","Type":"ContainerStarted","Data":"320965a4f215576ca457054bdd1f64d1aacc7606a0c28cd76d32d5278b611a50"} Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.527789 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5688f744d6-ck9mn" podStartSLOduration=3.5277656029999998 podStartE2EDuration="3.527765603s" podCreationTimestamp="2025-11-26 15:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:51.518543798 +0000 UTC m=+1038.944291402" watchObservedRunningTime="2025-11-26 15:07:51.527765603 +0000 UTC m=+1038.953513207" Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.535103 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7955b99f58-492wx" 
event={"ID":"c7e365e4-f902-439d-92e3-de43fd6ccdaf","Type":"ContainerStarted","Data":"e090c1323e8fee448f9892a42794f967667f429df3d935ce76ff832105833e70"} Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.555289 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-848fcb696c-vfdpx" event={"ID":"bf420acf-d3d6-45d2-a484-66265c5a1bcd","Type":"ContainerStarted","Data":"459a150529400385e5000932b0ed16c5829f00501524d39fae55e1347a15a5f0"} Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.577579 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" podUID="4953a14e-2b1f-4cdb-b5c3-92edede693f1" containerName="dnsmasq-dns" containerID="cri-o://291b200c9699be2b0912666e63aa17a5b9d7b53776d26f77c55c038bed9e707c" gracePeriod=10 Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.577894 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1b53de3-9040-402f-af0f-3370cffae66f","Type":"ContainerStarted","Data":"d1e1b5ecb9aa2e190066e07799df6e1a4c015456d67289149b7b6b2f20f9527f"} Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.612620 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7955b99f58-492wx" podStartSLOduration=4.531953638 podStartE2EDuration="7.612593505s" podCreationTimestamp="2025-11-26 15:07:44 +0000 UTC" firstStartedPulling="2025-11-26 15:07:45.635912966 +0000 UTC m=+1033.061660570" lastFinishedPulling="2025-11-26 15:07:48.716552833 +0000 UTC m=+1036.142300437" observedRunningTime="2025-11-26 15:07:51.600356047 +0000 UTC m=+1039.026103651" watchObservedRunningTime="2025-11-26 15:07:51.612593505 +0000 UTC m=+1039.038341109" Nov 26 15:07:51 crc kubenswrapper[4651]: I1126 15:07:51.631432 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-848fcb696c-vfdpx" podStartSLOduration=4.723874165 
podStartE2EDuration="7.631409145s" podCreationTimestamp="2025-11-26 15:07:44 +0000 UTC" firstStartedPulling="2025-11-26 15:07:45.764144082 +0000 UTC m=+1033.189891686" lastFinishedPulling="2025-11-26 15:07:48.671679062 +0000 UTC m=+1036.097426666" observedRunningTime="2025-11-26 15:07:51.580717234 +0000 UTC m=+1039.006464838" watchObservedRunningTime="2025-11-26 15:07:51.631409145 +0000 UTC m=+1039.057156759" Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.627070 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fff66767-89s67" event={"ID":"615ee6d9-0216-4f0a-b9ea-579fc268806e","Type":"ContainerStarted","Data":"07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34"} Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.627826 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57fff66767-89s67" Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.637697 4651 generic.go:334] "Generic (PLEG): container finished" podID="4953a14e-2b1f-4cdb-b5c3-92edede693f1" containerID="291b200c9699be2b0912666e63aa17a5b9d7b53776d26f77c55c038bed9e707c" exitCode=0 Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.637750 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" event={"ID":"4953a14e-2b1f-4cdb-b5c3-92edede693f1","Type":"ContainerDied","Data":"291b200c9699be2b0912666e63aa17a5b9d7b53776d26f77c55c038bed9e707c"} Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.654456 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cd3eea7b-b827-4648-a65f-9f8508f0f6c2","Type":"ContainerStarted","Data":"072bf1f7383a4f1f50ce9b866e5b0f644df618df715f8dd2f1c15dc4c51b378e"} Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.660924 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.673821 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57fff66767-89s67" podStartSLOduration=3.673799326 podStartE2EDuration="3.673799326s" podCreationTimestamp="2025-11-26 15:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:52.660094838 +0000 UTC m=+1040.085842442" watchObservedRunningTime="2025-11-26 15:07:52.673799326 +0000 UTC m=+1040.099546930" Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.761877 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxplq\" (UniqueName: \"kubernetes.io/projected/4953a14e-2b1f-4cdb-b5c3-92edede693f1-kube-api-access-qxplq\") pod \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.762091 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-sb\") pod \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.762137 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-nb\") pod \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.762241 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-dns-svc\") pod 
\"4953a14e-2b1f-4cdb-b5c3-92edede693f1\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.762291 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-config\") pod \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\" (UID: \"4953a14e-2b1f-4cdb-b5c3-92edede693f1\") " Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.768009 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4953a14e-2b1f-4cdb-b5c3-92edede693f1-kube-api-access-qxplq" (OuterVolumeSpecName: "kube-api-access-qxplq") pod "4953a14e-2b1f-4cdb-b5c3-92edede693f1" (UID: "4953a14e-2b1f-4cdb-b5c3-92edede693f1"). InnerVolumeSpecName "kube-api-access-qxplq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.866146 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxplq\" (UniqueName: \"kubernetes.io/projected/4953a14e-2b1f-4cdb-b5c3-92edede693f1-kube-api-access-qxplq\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.887227 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.936925 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4953a14e-2b1f-4cdb-b5c3-92edede693f1" (UID: "4953a14e-2b1f-4cdb-b5c3-92edede693f1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.942566 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-config" (OuterVolumeSpecName: "config") pod "4953a14e-2b1f-4cdb-b5c3-92edede693f1" (UID: "4953a14e-2b1f-4cdb-b5c3-92edede693f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.977351 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:52 crc kubenswrapper[4651]: I1126 15:07:52.977394 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.013790 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4953a14e-2b1f-4cdb-b5c3-92edede693f1" (UID: "4953a14e-2b1f-4cdb-b5c3-92edede693f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.027793 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4953a14e-2b1f-4cdb-b5c3-92edede693f1" (UID: "4953a14e-2b1f-4cdb-b5c3-92edede693f1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.079303 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.079340 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4953a14e-2b1f-4cdb-b5c3-92edede693f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.686413 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" event={"ID":"4953a14e-2b1f-4cdb-b5c3-92edede693f1","Type":"ContainerDied","Data":"4f192f382025bcdbac7a58ac98ceeccdbe3995bd451a7c941f970f334e6c615d"} Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.686740 4651 scope.go:117] "RemoveContainer" containerID="291b200c9699be2b0912666e63aa17a5b9d7b53776d26f77c55c038bed9e707c" Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.686629 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-pqjjg" Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.714848 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cd3eea7b-b827-4648-a65f-9f8508f0f6c2","Type":"ContainerStarted","Data":"088a196db0a48e953a7c9261491a46faca353ccb69047db1497ecdcca7c986f1"} Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.714880 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" containerName="cinder-api" containerID="cri-o://088a196db0a48e953a7c9261491a46faca353ccb69047db1497ecdcca7c986f1" gracePeriod=30 Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.714856 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" containerName="cinder-api-log" containerID="cri-o://072bf1f7383a4f1f50ce9b866e5b0f644df618df715f8dd2f1c15dc4c51b378e" gracePeriod=30 Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.714935 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.730187 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-pqjjg"] Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.735746 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-pqjjg"] Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.742179 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1b53de3-9040-402f-af0f-3370cffae66f","Type":"ContainerStarted","Data":"a320b69f43bf26bc8e012ecde13f01134c4f7e45041dba7e6f31171e2637dba4"} Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.749442 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-api-0" podStartSLOduration=4.749421835 podStartE2EDuration="4.749421835s" podCreationTimestamp="2025-11-26 15:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:07:53.736651642 +0000 UTC m=+1041.162399256" watchObservedRunningTime="2025-11-26 15:07:53.749421835 +0000 UTC m=+1041.175169439" Nov 26 15:07:53 crc kubenswrapper[4651]: I1126 15:07:53.760180 4651 scope.go:117] "RemoveContainer" containerID="07bb429a3ec0af424619c9fa6946a0f2ce1766191bfedf24ff109bfff5814d19" Nov 26 15:07:54 crc kubenswrapper[4651]: I1126 15:07:54.036380 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5659bd5cb9-6wbmd" Nov 26 15:07:54 crc kubenswrapper[4651]: I1126 15:07:54.167611 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-746685dd-k8lhz"] Nov 26 15:07:54 crc kubenswrapper[4651]: I1126 15:07:54.167826 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-746685dd-k8lhz" podUID="2f337ea2-409f-4a06-8115-16aa4137f6bd" containerName="neutron-api" containerID="cri-o://20a18f2c3458a1d3cd6204c7a47f48a52e3c0eefd1b0c6fd4c15555e3a6d187c" gracePeriod=30 Nov 26 15:07:54 crc kubenswrapper[4651]: I1126 15:07:54.168202 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-746685dd-k8lhz" podUID="2f337ea2-409f-4a06-8115-16aa4137f6bd" containerName="neutron-httpd" containerID="cri-o://60fb8efc9bfc84f98cb543e4dfb966eb4ec00c9ba44ef1e538e94c292bc1b069" gracePeriod=30 Nov 26 15:07:54 crc kubenswrapper[4651]: I1126 15:07:54.753340 4651 generic.go:334] "Generic (PLEG): container finished" podID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" containerID="072bf1f7383a4f1f50ce9b866e5b0f644df618df715f8dd2f1c15dc4c51b378e" exitCode=143 Nov 26 15:07:54 crc kubenswrapper[4651]: I1126 15:07:54.754604 4651 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cd3eea7b-b827-4648-a65f-9f8508f0f6c2","Type":"ContainerDied","Data":"072bf1f7383a4f1f50ce9b866e5b0f644df618df715f8dd2f1c15dc4c51b378e"} Nov 26 15:07:54 crc kubenswrapper[4651]: I1126 15:07:54.757077 4651 generic.go:334] "Generic (PLEG): container finished" podID="2f337ea2-409f-4a06-8115-16aa4137f6bd" containerID="60fb8efc9bfc84f98cb543e4dfb966eb4ec00c9ba44ef1e538e94c292bc1b069" exitCode=0 Nov 26 15:07:54 crc kubenswrapper[4651]: I1126 15:07:54.757139 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-746685dd-k8lhz" event={"ID":"2f337ea2-409f-4a06-8115-16aa4137f6bd","Type":"ContainerDied","Data":"60fb8efc9bfc84f98cb543e4dfb966eb4ec00c9ba44ef1e538e94c292bc1b069"} Nov 26 15:07:54 crc kubenswrapper[4651]: I1126 15:07:54.759268 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1b53de3-9040-402f-af0f-3370cffae66f","Type":"ContainerStarted","Data":"a717f34c1d73be15a5aab889e8b38d781cfbccf61df2b0254d4a146c14b51c1a"} Nov 26 15:07:54 crc kubenswrapper[4651]: I1126 15:07:54.797943 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.495238012 podStartE2EDuration="5.797922785s" podCreationTimestamp="2025-11-26 15:07:49 +0000 UTC" firstStartedPulling="2025-11-26 15:07:50.87447254 +0000 UTC m=+1038.300220144" lastFinishedPulling="2025-11-26 15:07:52.177157313 +0000 UTC m=+1039.602904917" observedRunningTime="2025-11-26 15:07:54.788644209 +0000 UTC m=+1042.214391833" watchObservedRunningTime="2025-11-26 15:07:54.797922785 +0000 UTC m=+1042.223670389" Nov 26 15:07:55 crc kubenswrapper[4651]: I1126 15:07:55.168879 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 26 15:07:55 crc kubenswrapper[4651]: I1126 15:07:55.414379 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4953a14e-2b1f-4cdb-b5c3-92edede693f1" path="/var/lib/kubelet/pods/4953a14e-2b1f-4cdb-b5c3-92edede693f1/volumes" Nov 26 15:07:57 crc kubenswrapper[4651]: I1126 15:07:57.625467 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:57 crc kubenswrapper[4651]: I1126 15:07:57.679574 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:07:57 crc kubenswrapper[4651]: I1126 15:07:57.800593 4651 generic.go:334] "Generic (PLEG): container finished" podID="2f337ea2-409f-4a06-8115-16aa4137f6bd" containerID="20a18f2c3458a1d3cd6204c7a47f48a52e3c0eefd1b0c6fd4c15555e3a6d187c" exitCode=0 Nov 26 15:07:57 crc kubenswrapper[4651]: I1126 15:07:57.800688 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-746685dd-k8lhz" event={"ID":"2f337ea2-409f-4a06-8115-16aa4137f6bd","Type":"ContainerDied","Data":"20a18f2c3458a1d3cd6204c7a47f48a52e3c0eefd1b0c6fd4c15555e3a6d187c"} Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.685359 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.782693 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-ovndb-tls-certs\") pod \"2f337ea2-409f-4a06-8115-16aa4137f6bd\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.782754 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-config\") pod \"2f337ea2-409f-4a06-8115-16aa4137f6bd\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.782794 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hnxt\" (UniqueName: \"kubernetes.io/projected/2f337ea2-409f-4a06-8115-16aa4137f6bd-kube-api-access-6hnxt\") pod \"2f337ea2-409f-4a06-8115-16aa4137f6bd\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.782828 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-httpd-config\") pod \"2f337ea2-409f-4a06-8115-16aa4137f6bd\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.782852 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-combined-ca-bundle\") pod \"2f337ea2-409f-4a06-8115-16aa4137f6bd\" (UID: \"2f337ea2-409f-4a06-8115-16aa4137f6bd\") " Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.800204 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2f337ea2-409f-4a06-8115-16aa4137f6bd-kube-api-access-6hnxt" (OuterVolumeSpecName: "kube-api-access-6hnxt") pod "2f337ea2-409f-4a06-8115-16aa4137f6bd" (UID: "2f337ea2-409f-4a06-8115-16aa4137f6bd"). InnerVolumeSpecName "kube-api-access-6hnxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.821168 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2f337ea2-409f-4a06-8115-16aa4137f6bd" (UID: "2f337ea2-409f-4a06-8115-16aa4137f6bd"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.838024 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-746685dd-k8lhz" event={"ID":"2f337ea2-409f-4a06-8115-16aa4137f6bd","Type":"ContainerDied","Data":"03a38354795026d12fe1593f8ce65d2cc234216b00e4bf3fac8e7cac7d88cb0d"} Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.838089 4651 scope.go:117] "RemoveContainer" containerID="60fb8efc9bfc84f98cb543e4dfb966eb4ec00c9ba44ef1e538e94c292bc1b069" Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.838282 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-746685dd-k8lhz" Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.884761 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hnxt\" (UniqueName: \"kubernetes.io/projected/2f337ea2-409f-4a06-8115-16aa4137f6bd-kube-api-access-6hnxt\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:58 crc kubenswrapper[4651]: I1126 15:07:58.884994 4651 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.000170 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f337ea2-409f-4a06-8115-16aa4137f6bd" (UID: "2f337ea2-409f-4a06-8115-16aa4137f6bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.067194 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2f337ea2-409f-4a06-8115-16aa4137f6bd" (UID: "2f337ea2-409f-4a06-8115-16aa4137f6bd"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.096119 4651 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.096145 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.096969 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-config" (OuterVolumeSpecName: "config") pod "2f337ea2-409f-4a06-8115-16aa4137f6bd" (UID: "2f337ea2-409f-4a06-8115-16aa4137f6bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.153189 4651 scope.go:117] "RemoveContainer" containerID="20a18f2c3458a1d3cd6204c7a47f48a52e3c0eefd1b0c6fd4c15555e3a6d187c" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.189186 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-746685dd-k8lhz"] Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.194666 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-746685dd-k8lhz"] Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.199238 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f337ea2-409f-4a06-8115-16aa4137f6bd-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.420712 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f337ea2-409f-4a06-8115-16aa4137f6bd" path="/var/lib/kubelet/pods/2f337ea2-409f-4a06-8115-16aa4137f6bd/volumes" 
Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.612812 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.612899 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.613943 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"bc932f0bacd9c20ebf1824e9687b4f2688afd1574336c4d92ff1fad88d1f5394"} pod="openstack/horizon-6974b49b94-vzn8h" containerMessage="Container horizon failed startup probe, will be restarted" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.613979 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" containerID="cri-o://bc932f0bacd9c20ebf1824e9687b4f2688afd1574336c4d92ff1fad88d1f5394" gracePeriod=30 Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.783651 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f54c7c77d-rx8gm" podUID="5c09de21-84b0-440d-b34c-3054ec6741fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.783740 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.784643 4651 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="horizon" containerStatusID={"Type":"cri-o","ID":"56761142c110a594c6d6a7518e9e4944e0f87669709325bcff97b8c278e4b419"} pod="openstack/horizon-f54c7c77d-rx8gm" containerMessage="Container horizon failed startup probe, will be restarted" Nov 26 15:07:59 crc kubenswrapper[4651]: I1126 15:07:59.784672 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f54c7c77d-rx8gm" podUID="5c09de21-84b0-440d-b34c-3054ec6741fc" containerName="horizon" containerID="cri-o://56761142c110a594c6d6a7518e9e4944e0f87669709325bcff97b8c278e4b419" gracePeriod=30 Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.222267 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57fff66767-89s67" Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.301994 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-zhts9"] Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.302259 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fb745b69-zhts9" podUID="0989eafe-2213-40fa-89b4-f4df03c3d934" containerName="dnsmasq-dns" containerID="cri-o://1edbe76f7461d0e15bb50e3d25c46101540fe7fa380949172e2d3e161d96af11" gracePeriod=10 Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.808941 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.876963 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.892996 4651 generic.go:334] "Generic (PLEG): container finished" podID="0989eafe-2213-40fa-89b4-f4df03c3d934" containerID="1edbe76f7461d0e15bb50e3d25c46101540fe7fa380949172e2d3e161d96af11" exitCode=0 Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.893250 4651 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/cinder-scheduler-0" podUID="b1b53de3-9040-402f-af0f-3370cffae66f" containerName="cinder-scheduler" containerID="cri-o://a320b69f43bf26bc8e012ecde13f01134c4f7e45041dba7e6f31171e2637dba4" gracePeriod=30 Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.893588 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-zhts9" event={"ID":"0989eafe-2213-40fa-89b4-f4df03c3d934","Type":"ContainerDied","Data":"1edbe76f7461d0e15bb50e3d25c46101540fe7fa380949172e2d3e161d96af11"} Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.893620 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-zhts9" event={"ID":"0989eafe-2213-40fa-89b4-f4df03c3d934","Type":"ContainerDied","Data":"2a50bdad0e53b3cbc550079693e3b33fad9be72605f9976a3f3a069bd74e053d"} Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.893631 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a50bdad0e53b3cbc550079693e3b33fad9be72605f9976a3f3a069bd74e053d" Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.893862 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b1b53de3-9040-402f-af0f-3370cffae66f" containerName="probe" containerID="cri-o://a717f34c1d73be15a5aab889e8b38d781cfbccf61df2b0254d4a146c14b51c1a" gracePeriod=30 Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.915286 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.965824 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd96t\" (UniqueName: \"kubernetes.io/projected/0989eafe-2213-40fa-89b4-f4df03c3d934-kube-api-access-kd96t\") pod \"0989eafe-2213-40fa-89b4-f4df03c3d934\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.965876 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-config\") pod \"0989eafe-2213-40fa-89b4-f4df03c3d934\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.965962 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-dns-svc\") pod \"0989eafe-2213-40fa-89b4-f4df03c3d934\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.965991 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-nb\") pod \"0989eafe-2213-40fa-89b4-f4df03c3d934\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.966098 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-sb\") pod \"0989eafe-2213-40fa-89b4-f4df03c3d934\" (UID: \"0989eafe-2213-40fa-89b4-f4df03c3d934\") " Nov 26 15:08:00 crc kubenswrapper[4651]: I1126 15:08:00.974174 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0989eafe-2213-40fa-89b4-f4df03c3d934-kube-api-access-kd96t" (OuterVolumeSpecName: "kube-api-access-kd96t") pod "0989eafe-2213-40fa-89b4-f4df03c3d934" (UID: "0989eafe-2213-40fa-89b4-f4df03c3d934"). InnerVolumeSpecName "kube-api-access-kd96t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.069492 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd96t\" (UniqueName: \"kubernetes.io/projected/0989eafe-2213-40fa-89b4-f4df03c3d934-kube-api-access-kd96t\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.070419 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0989eafe-2213-40fa-89b4-f4df03c3d934" (UID: "0989eafe-2213-40fa-89b4-f4df03c3d934"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.072500 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0989eafe-2213-40fa-89b4-f4df03c3d934" (UID: "0989eafe-2213-40fa-89b4-f4df03c3d934"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.098925 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0989eafe-2213-40fa-89b4-f4df03c3d934" (UID: "0989eafe-2213-40fa-89b4-f4df03c3d934"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.133471 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-config" (OuterVolumeSpecName: "config") pod "0989eafe-2213-40fa-89b4-f4df03c3d934" (UID: "0989eafe-2213-40fa-89b4-f4df03c3d934"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.171665 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.171710 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.171721 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.171732 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0989eafe-2213-40fa-89b4-f4df03c3d934-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.900368 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-zhts9" Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.920921 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-zhts9"] Nov 26 15:08:01 crc kubenswrapper[4651]: I1126 15:08:01.927167 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-zhts9"] Nov 26 15:08:02 crc kubenswrapper[4651]: I1126 15:08:02.150710 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5688f744d6-ck9mn" Nov 26 15:08:02 crc kubenswrapper[4651]: I1126 15:08:02.181844 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5688f744d6-ck9mn" Nov 26 15:08:02 crc kubenswrapper[4651]: I1126 15:08:02.319324 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d8f64784b-d4hq5"] Nov 26 15:08:02 crc kubenswrapper[4651]: I1126 15:08:02.319869 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d8f64784b-d4hq5" podUID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerName="barbican-api-log" containerID="cri-o://ff10f07f66032ee801ad18eea83cde81a45d9b921c8a3d1057163322eaa055a6" gracePeriod=30 Nov 26 15:08:02 crc kubenswrapper[4651]: I1126 15:08:02.320373 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d8f64784b-d4hq5" podUID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerName="barbican-api" containerID="cri-o://5efd66eb0c6328ebde61af93210afab889e30881a6820800c8e04b92b0db3099" gracePeriod=30 Nov 26 15:08:02 crc kubenswrapper[4651]: I1126 15:08:02.925632 4651 generic.go:334] "Generic (PLEG): container finished" podID="b1b53de3-9040-402f-af0f-3370cffae66f" containerID="a717f34c1d73be15a5aab889e8b38d781cfbccf61df2b0254d4a146c14b51c1a" exitCode=0 Nov 26 15:08:02 crc kubenswrapper[4651]: I1126 15:08:02.925871 4651 generic.go:334] "Generic 
(PLEG): container finished" podID="b1b53de3-9040-402f-af0f-3370cffae66f" containerID="a320b69f43bf26bc8e012ecde13f01134c4f7e45041dba7e6f31171e2637dba4" exitCode=0 Nov 26 15:08:02 crc kubenswrapper[4651]: I1126 15:08:02.925945 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1b53de3-9040-402f-af0f-3370cffae66f","Type":"ContainerDied","Data":"a717f34c1d73be15a5aab889e8b38d781cfbccf61df2b0254d4a146c14b51c1a"} Nov 26 15:08:02 crc kubenswrapper[4651]: I1126 15:08:02.925969 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1b53de3-9040-402f-af0f-3370cffae66f","Type":"ContainerDied","Data":"a320b69f43bf26bc8e012ecde13f01134c4f7e45041dba7e6f31171e2637dba4"} Nov 26 15:08:02 crc kubenswrapper[4651]: I1126 15:08:02.928786 4651 generic.go:334] "Generic (PLEG): container finished" podID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerID="ff10f07f66032ee801ad18eea83cde81a45d9b921c8a3d1057163322eaa055a6" exitCode=143 Nov 26 15:08:02 crc kubenswrapper[4651]: I1126 15:08:02.929113 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d8f64784b-d4hq5" event={"ID":"a68a3631-5c70-4319-844d-4c015bd0fe32","Type":"ContainerDied","Data":"ff10f07f66032ee801ad18eea83cde81a45d9b921c8a3d1057163322eaa055a6"} Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.093151 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.278363 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-scripts\") pod \"b1b53de3-9040-402f-af0f-3370cffae66f\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.278521 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78x5l\" (UniqueName: \"kubernetes.io/projected/b1b53de3-9040-402f-af0f-3370cffae66f-kube-api-access-78x5l\") pod \"b1b53de3-9040-402f-af0f-3370cffae66f\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.278692 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data\") pod \"b1b53de3-9040-402f-af0f-3370cffae66f\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.279507 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-combined-ca-bundle\") pod \"b1b53de3-9040-402f-af0f-3370cffae66f\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.279571 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1b53de3-9040-402f-af0f-3370cffae66f-etc-machine-id\") pod \"b1b53de3-9040-402f-af0f-3370cffae66f\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.279598 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data-custom\") pod \"b1b53de3-9040-402f-af0f-3370cffae66f\" (UID: \"b1b53de3-9040-402f-af0f-3370cffae66f\") " Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.286111 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1b53de3-9040-402f-af0f-3370cffae66f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b1b53de3-9040-402f-af0f-3370cffae66f" (UID: "b1b53de3-9040-402f-af0f-3370cffae66f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.291131 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b53de3-9040-402f-af0f-3370cffae66f-kube-api-access-78x5l" (OuterVolumeSpecName: "kube-api-access-78x5l") pod "b1b53de3-9040-402f-af0f-3370cffae66f" (UID: "b1b53de3-9040-402f-af0f-3370cffae66f"). InnerVolumeSpecName "kube-api-access-78x5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.305330 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b1b53de3-9040-402f-af0f-3370cffae66f" (UID: "b1b53de3-9040-402f-af0f-3370cffae66f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.308133 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-scripts" (OuterVolumeSpecName: "scripts") pod "b1b53de3-9040-402f-af0f-3370cffae66f" (UID: "b1b53de3-9040-402f-af0f-3370cffae66f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.366149 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1b53de3-9040-402f-af0f-3370cffae66f" (UID: "b1b53de3-9040-402f-af0f-3370cffae66f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.382259 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78x5l\" (UniqueName: \"kubernetes.io/projected/b1b53de3-9040-402f-af0f-3370cffae66f-kube-api-access-78x5l\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.382293 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.382313 4651 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1b53de3-9040-402f-af0f-3370cffae66f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.382321 4651 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.382329 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.408204 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data" (OuterVolumeSpecName: "config-data") pod "b1b53de3-9040-402f-af0f-3370cffae66f" (UID: "b1b53de3-9040-402f-af0f-3370cffae66f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.422597 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0989eafe-2213-40fa-89b4-f4df03c3d934" path="/var/lib/kubelet/pods/0989eafe-2213-40fa-89b4-f4df03c3d934/volumes" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.484280 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b53de3-9040-402f-af0f-3370cffae66f-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.972526 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1b53de3-9040-402f-af0f-3370cffae66f","Type":"ContainerDied","Data":"d1e1b5ecb9aa2e190066e07799df6e1a4c015456d67289149b7b6b2f20f9527f"} Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.972808 4651 scope.go:117] "RemoveContainer" containerID="a717f34c1d73be15a5aab889e8b38d781cfbccf61df2b0254d4a146c14b51c1a" Nov 26 15:08:03 crc kubenswrapper[4651]: I1126 15:08:03.972986 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.024550 4651 scope.go:117] "RemoveContainer" containerID="a320b69f43bf26bc8e012ecde13f01134c4f7e45041dba7e6f31171e2637dba4" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.055949 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.075521 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.094010 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:08:04 crc kubenswrapper[4651]: E1126 15:08:04.095471 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0989eafe-2213-40fa-89b4-f4df03c3d934" containerName="init" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.095498 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="0989eafe-2213-40fa-89b4-f4df03c3d934" containerName="init" Nov 26 15:08:04 crc kubenswrapper[4651]: E1126 15:08:04.095513 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f337ea2-409f-4a06-8115-16aa4137f6bd" containerName="neutron-api" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.095520 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f337ea2-409f-4a06-8115-16aa4137f6bd" containerName="neutron-api" Nov 26 15:08:04 crc kubenswrapper[4651]: E1126 15:08:04.095539 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f337ea2-409f-4a06-8115-16aa4137f6bd" containerName="neutron-httpd" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.095546 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f337ea2-409f-4a06-8115-16aa4137f6bd" containerName="neutron-httpd" Nov 26 15:08:04 crc kubenswrapper[4651]: E1126 15:08:04.097240 4651 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0989eafe-2213-40fa-89b4-f4df03c3d934" containerName="dnsmasq-dns" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.097266 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="0989eafe-2213-40fa-89b4-f4df03c3d934" containerName="dnsmasq-dns" Nov 26 15:08:04 crc kubenswrapper[4651]: E1126 15:08:04.097303 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b53de3-9040-402f-af0f-3370cffae66f" containerName="cinder-scheduler" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.097312 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b53de3-9040-402f-af0f-3370cffae66f" containerName="cinder-scheduler" Nov 26 15:08:04 crc kubenswrapper[4651]: E1126 15:08:04.097326 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4953a14e-2b1f-4cdb-b5c3-92edede693f1" containerName="init" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.097334 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="4953a14e-2b1f-4cdb-b5c3-92edede693f1" containerName="init" Nov 26 15:08:04 crc kubenswrapper[4651]: E1126 15:08:04.097346 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4953a14e-2b1f-4cdb-b5c3-92edede693f1" containerName="dnsmasq-dns" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.097354 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="4953a14e-2b1f-4cdb-b5c3-92edede693f1" containerName="dnsmasq-dns" Nov 26 15:08:04 crc kubenswrapper[4651]: E1126 15:08:04.097367 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b53de3-9040-402f-af0f-3370cffae66f" containerName="probe" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.097375 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b53de3-9040-402f-af0f-3370cffae66f" containerName="probe" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.097657 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b53de3-9040-402f-af0f-3370cffae66f" containerName="probe" Nov 
26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.097685 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f337ea2-409f-4a06-8115-16aa4137f6bd" containerName="neutron-httpd" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.097703 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="4953a14e-2b1f-4cdb-b5c3-92edede693f1" containerName="dnsmasq-dns" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.097712 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b53de3-9040-402f-af0f-3370cffae66f" containerName="cinder-scheduler" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.097723 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f337ea2-409f-4a06-8115-16aa4137f6bd" containerName="neutron-api" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.097735 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="0989eafe-2213-40fa-89b4-f4df03c3d934" containerName="dnsmasq-dns" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.099008 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.101746 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.135331 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.200096 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.200259 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e27ec632-c98c-4da3-a998-299e18d1bc99-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.200299 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.200354 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv2sb\" (UniqueName: \"kubernetes.io/projected/e27ec632-c98c-4da3-a998-299e18d1bc99-kube-api-access-hv2sb\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc 
kubenswrapper[4651]: I1126 15:08:04.200402 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-scripts\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.200451 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-config-data\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.301623 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e27ec632-c98c-4da3-a998-299e18d1bc99-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.301661 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.301700 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2sb\" (UniqueName: \"kubernetes.io/projected/e27ec632-c98c-4da3-a998-299e18d1bc99-kube-api-access-hv2sb\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.301727 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-scripts\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.301724 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e27ec632-c98c-4da3-a998-299e18d1bc99-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.301765 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-config-data\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.301906 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.313990 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-config-data\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.324904 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " 
pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.328652 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-scripts\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.328710 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e27ec632-c98c-4da3-a998-299e18d1bc99-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.335753 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2sb\" (UniqueName: \"kubernetes.io/projected/e27ec632-c98c-4da3-a998-299e18d1bc99-kube-api-access-hv2sb\") pod \"cinder-scheduler-0\" (UID: \"e27ec632-c98c-4da3-a998-299e18d1bc99\") " pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.445691 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 26 15:08:04 crc kubenswrapper[4651]: I1126 15:08:04.687221 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 26 15:08:05 crc kubenswrapper[4651]: I1126 15:08:05.030814 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 26 15:08:05 crc kubenswrapper[4651]: I1126 15:08:05.418060 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b53de3-9040-402f-af0f-3370cffae66f" path="/var/lib/kubelet/pods/b1b53de3-9040-402f-af0f-3370cffae66f/volumes" Nov 26 15:08:05 crc kubenswrapper[4651]: I1126 15:08:05.601223 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d8f64784b-d4hq5" podUID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:41988->10.217.0.160:9311: read: connection reset by peer" Nov 26 15:08:05 crc kubenswrapper[4651]: I1126 15:08:05.601223 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d8f64784b-d4hq5" podUID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:41994->10.217.0.160:9311: read: connection reset by peer" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.016679 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.023931 4651 generic.go:334] "Generic (PLEG): container finished" podID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerID="5efd66eb0c6328ebde61af93210afab889e30881a6820800c8e04b92b0db3099" exitCode=0 Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.023987 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-d8f64784b-d4hq5" event={"ID":"a68a3631-5c70-4319-844d-4c015bd0fe32","Type":"ContainerDied","Data":"5efd66eb0c6328ebde61af93210afab889e30881a6820800c8e04b92b0db3099"} Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.025376 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e27ec632-c98c-4da3-a998-299e18d1bc99","Type":"ContainerStarted","Data":"13b15bf6260c68e4fcca05bb2a7a769fb97b5718855311724b6c9db2a5bed3ae"} Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.025394 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e27ec632-c98c-4da3-a998-299e18d1bc99","Type":"ContainerStarted","Data":"60a358c9b8d472b3c5673b0ccc7f46fb415dcfb1e2b62aea0ab092bd164bb756"} Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.118874 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.212241 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d7dcdb968-2bhkx" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.259639 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-combined-ca-bundle\") pod \"a68a3631-5c70-4319-844d-4c015bd0fe32\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.259721 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jr5d\" (UniqueName: \"kubernetes.io/projected/a68a3631-5c70-4319-844d-4c015bd0fe32-kube-api-access-2jr5d\") pod \"a68a3631-5c70-4319-844d-4c015bd0fe32\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.259778 4651 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data\") pod \"a68a3631-5c70-4319-844d-4c015bd0fe32\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.259806 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a68a3631-5c70-4319-844d-4c015bd0fe32-logs\") pod \"a68a3631-5c70-4319-844d-4c015bd0fe32\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.259913 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data-custom\") pod \"a68a3631-5c70-4319-844d-4c015bd0fe32\" (UID: \"a68a3631-5c70-4319-844d-4c015bd0fe32\") " Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.265812 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a68a3631-5c70-4319-844d-4c015bd0fe32" (UID: "a68a3631-5c70-4319-844d-4c015bd0fe32"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.269802 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a68a3631-5c70-4319-844d-4c015bd0fe32-logs" (OuterVolumeSpecName: "logs") pod "a68a3631-5c70-4319-844d-4c015bd0fe32" (UID: "a68a3631-5c70-4319-844d-4c015bd0fe32"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.285285 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a68a3631-5c70-4319-844d-4c015bd0fe32-kube-api-access-2jr5d" (OuterVolumeSpecName: "kube-api-access-2jr5d") pod "a68a3631-5c70-4319-844d-4c015bd0fe32" (UID: "a68a3631-5c70-4319-844d-4c015bd0fe32"). InnerVolumeSpecName "kube-api-access-2jr5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.333517 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a68a3631-5c70-4319-844d-4c015bd0fe32" (UID: "a68a3631-5c70-4319-844d-4c015bd0fe32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.361842 4651 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.361874 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.361885 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jr5d\" (UniqueName: \"kubernetes.io/projected/a68a3631-5c70-4319-844d-4c015bd0fe32-kube-api-access-2jr5d\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.361895 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a68a3631-5c70-4319-844d-4c015bd0fe32-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.394687 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data" (OuterVolumeSpecName: "config-data") pod "a68a3631-5c70-4319-844d-4c015bd0fe32" (UID: "a68a3631-5c70-4319-844d-4c015bd0fe32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:06 crc kubenswrapper[4651]: I1126 15:08:06.465822 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68a3631-5c70-4319-844d-4c015bd0fe32-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:07 crc kubenswrapper[4651]: I1126 15:08:07.037084 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e27ec632-c98c-4da3-a998-299e18d1bc99","Type":"ContainerStarted","Data":"359a293c7b053455406b852b2da954db1c1fc755233b2c2214e7f51d072cb13d"} Nov 26 15:08:07 crc kubenswrapper[4651]: I1126 15:08:07.045491 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d8f64784b-d4hq5" Nov 26 15:08:07 crc kubenswrapper[4651]: I1126 15:08:07.049060 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d8f64784b-d4hq5" event={"ID":"a68a3631-5c70-4319-844d-4c015bd0fe32","Type":"ContainerDied","Data":"f0370053f9f04178b60b3f40b099790894ebeecdd7f5232354bc1eb4ca8dad14"} Nov 26 15:08:07 crc kubenswrapper[4651]: I1126 15:08:07.049138 4651 scope.go:117] "RemoveContainer" containerID="5efd66eb0c6328ebde61af93210afab889e30881a6820800c8e04b92b0db3099" Nov 26 15:08:07 crc kubenswrapper[4651]: I1126 15:08:07.084252 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.084199092 podStartE2EDuration="3.084199092s" podCreationTimestamp="2025-11-26 15:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:08:07.070662428 +0000 UTC m=+1054.496410052" watchObservedRunningTime="2025-11-26 15:08:07.084199092 +0000 UTC m=+1054.509946706" Nov 26 15:08:07 crc kubenswrapper[4651]: I1126 15:08:07.103559 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d8f64784b-d4hq5"] Nov 26 15:08:07 crc kubenswrapper[4651]: I1126 15:08:07.112266 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d8f64784b-d4hq5"] Nov 26 15:08:07 crc kubenswrapper[4651]: I1126 15:08:07.117958 4651 scope.go:117] "RemoveContainer" containerID="ff10f07f66032ee801ad18eea83cde81a45d9b921c8a3d1057163322eaa055a6" Nov 26 15:08:07 crc kubenswrapper[4651]: I1126 15:08:07.415880 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a68a3631-5c70-4319-844d-4c015bd0fe32" path="/var/lib/kubelet/pods/a68a3631-5c70-4319-844d-4c015bd0fe32/volumes" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.186558 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/keystone-7d45c4597-qv4b7" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.339934 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 26 15:08:08 crc kubenswrapper[4651]: E1126 15:08:08.340472 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerName="barbican-api-log" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.340490 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerName="barbican-api-log" Nov 26 15:08:08 crc kubenswrapper[4651]: E1126 15:08:08.340514 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerName="barbican-api" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.340522 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerName="barbican-api" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.340752 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerName="barbican-api-log" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.340768 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68a3631-5c70-4319-844d-4c015bd0fe32" containerName="barbican-api" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.341544 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.346425 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-bgfxc" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.346611 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.346724 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.347235 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.518597 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvr2l\" (UniqueName: \"kubernetes.io/projected/a13f0157-d7dd-46c4-86cf-7397655d1e83-kube-api-access-bvr2l\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.518668 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a13f0157-d7dd-46c4-86cf-7397655d1e83-openstack-config\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.518697 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13f0157-d7dd-46c4-86cf-7397655d1e83-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.518784 4651 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a13f0157-d7dd-46c4-86cf-7397655d1e83-openstack-config-secret\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.620509 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a13f0157-d7dd-46c4-86cf-7397655d1e83-openstack-config-secret\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.620678 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvr2l\" (UniqueName: \"kubernetes.io/projected/a13f0157-d7dd-46c4-86cf-7397655d1e83-kube-api-access-bvr2l\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.620738 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a13f0157-d7dd-46c4-86cf-7397655d1e83-openstack-config\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.620762 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13f0157-d7dd-46c4-86cf-7397655d1e83-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.621556 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/a13f0157-d7dd-46c4-86cf-7397655d1e83-openstack-config\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.626766 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13f0157-d7dd-46c4-86cf-7397655d1e83-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.627733 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a13f0157-d7dd-46c4-86cf-7397655d1e83-openstack-config-secret\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.654604 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvr2l\" (UniqueName: \"kubernetes.io/projected/a13f0157-d7dd-46c4-86cf-7397655d1e83-kube-api-access-bvr2l\") pod \"openstackclient\" (UID: \"a13f0157-d7dd-46c4-86cf-7397655d1e83\") " pod="openstack/openstackclient" Nov 26 15:08:08 crc kubenswrapper[4651]: I1126 15:08:08.675871 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 26 15:08:09 crc kubenswrapper[4651]: I1126 15:08:09.185763 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 26 15:08:09 crc kubenswrapper[4651]: W1126 15:08:09.190557 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda13f0157_d7dd_46c4_86cf_7397655d1e83.slice/crio-cc5d09e96a9c4fe0ede6b532047b7f385db628dea22c78c26d8585ea1ce327ec WatchSource:0}: Error finding container cc5d09e96a9c4fe0ede6b532047b7f385db628dea22c78c26d8585ea1ce327ec: Status 404 returned error can't find the container with id cc5d09e96a9c4fe0ede6b532047b7f385db628dea22c78c26d8585ea1ce327ec Nov 26 15:08:09 crc kubenswrapper[4651]: I1126 15:08:09.446668 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 26 15:08:10 crc kubenswrapper[4651]: I1126 15:08:10.069677 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a13f0157-d7dd-46c4-86cf-7397655d1e83","Type":"ContainerStarted","Data":"cc5d09e96a9c4fe0ede6b532047b7f385db628dea22c78c26d8585ea1ce327ec"} Nov 26 15:08:10 crc kubenswrapper[4651]: I1126 15:08:10.319817 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.164:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:08:10 crc kubenswrapper[4651]: I1126 15:08:10.508155 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.530639 4651 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/swift-proxy-6978d54687-jsqtl"] Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.532675 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.535752 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.535964 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.536771 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.588527 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6978d54687-jsqtl"] Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.633944 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr6zz\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-kube-api-access-rr6zz\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.633991 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09fca043-ad27-4285-8894-522bc6cc68f4-run-httpd\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.634021 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-config-data\") 
pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.634060 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09fca043-ad27-4285-8894-522bc6cc68f4-log-httpd\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.634105 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-internal-tls-certs\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.634159 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.634177 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-public-tls-certs\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.634249 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-combined-ca-bundle\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.736224 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-combined-ca-bundle\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.736304 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr6zz\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-kube-api-access-rr6zz\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.736326 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09fca043-ad27-4285-8894-522bc6cc68f4-run-httpd\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.736349 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-config-data\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.736369 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/09fca043-ad27-4285-8894-522bc6cc68f4-log-httpd\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.736401 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-internal-tls-certs\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.736451 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.736471 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-public-tls-certs\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.737144 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09fca043-ad27-4285-8894-522bc6cc68f4-log-httpd\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: E1126 15:08:13.737238 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:08:13 crc kubenswrapper[4651]: E1126 15:08:13.737254 4651 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6978d54687-jsqtl: configmap "swift-ring-files" not found Nov 26 15:08:13 crc kubenswrapper[4651]: E1126 15:08:13.737302 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift podName:09fca043-ad27-4285-8894-522bc6cc68f4 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:14.237286951 +0000 UTC m=+1061.663034555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift") pod "swift-proxy-6978d54687-jsqtl" (UID: "09fca043-ad27-4285-8894-522bc6cc68f4") : configmap "swift-ring-files" not found Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.737825 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09fca043-ad27-4285-8894-522bc6cc68f4-run-httpd\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.745309 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-config-data\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.746322 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-combined-ca-bundle\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.746601 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-public-tls-certs\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.760439 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09fca043-ad27-4285-8894-522bc6cc68f4-internal-tls-certs\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:13 crc kubenswrapper[4651]: I1126 15:08:13.772801 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr6zz\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-kube-api-access-rr6zz\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.244294 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:14 crc kubenswrapper[4651]: E1126 15:08:14.244511 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:08:14 crc kubenswrapper[4651]: E1126 15:08:14.244546 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6978d54687-jsqtl: configmap "swift-ring-files" not found Nov 26 15:08:14 crc kubenswrapper[4651]: E1126 15:08:14.244606 4651 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift podName:09fca043-ad27-4285-8894-522bc6cc68f4 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:15.244588578 +0000 UTC m=+1062.670336182 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift") pod "swift-proxy-6978d54687-jsqtl" (UID: "09fca043-ad27-4285-8894-522bc6cc68f4") : configmap "swift-ring-files" not found Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.500120 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-jjrbk"] Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.501216 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jjrbk" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.512904 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jjrbk"] Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.549142 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5279318-bb6d-455f-96a5-d410d0468c6b-operator-scripts\") pod \"nova-api-db-create-jjrbk\" (UID: \"b5279318-bb6d-455f-96a5-d410d0468c6b\") " pod="openstack/nova-api-db-create-jjrbk" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.549313 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpfs2\" (UniqueName: \"kubernetes.io/projected/b5279318-bb6d-455f-96a5-d410d0468c6b-kube-api-access-hpfs2\") pod \"nova-api-db-create-jjrbk\" (UID: \"b5279318-bb6d-455f-96a5-d410d0468c6b\") " pod="openstack/nova-api-db-create-jjrbk" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.601284 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-n2799"] 
Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.602971 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n2799" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.622579 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e077-account-create-update-9ntsd"] Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.624007 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e077-account-create-update-9ntsd" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.627942 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.650949 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e077-account-create-update-9ntsd"] Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.652055 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5279318-bb6d-455f-96a5-d410d0468c6b-operator-scripts\") pod \"nova-api-db-create-jjrbk\" (UID: \"b5279318-bb6d-455f-96a5-d410d0468c6b\") " pod="openstack/nova-api-db-create-jjrbk" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.652149 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5053f0e2-7865-4be5-9601-2c69da731509-operator-scripts\") pod \"nova-cell0-db-create-n2799\" (UID: \"5053f0e2-7865-4be5-9601-2c69da731509\") " pod="openstack/nova-cell0-db-create-n2799" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.652204 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpfs2\" (UniqueName: \"kubernetes.io/projected/b5279318-bb6d-455f-96a5-d410d0468c6b-kube-api-access-hpfs2\") pod \"nova-api-db-create-jjrbk\" (UID: 
\"b5279318-bb6d-455f-96a5-d410d0468c6b\") " pod="openstack/nova-api-db-create-jjrbk" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.652282 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw6dr\" (UniqueName: \"kubernetes.io/projected/5053f0e2-7865-4be5-9601-2c69da731509-kube-api-access-hw6dr\") pod \"nova-cell0-db-create-n2799\" (UID: \"5053f0e2-7865-4be5-9601-2c69da731509\") " pod="openstack/nova-cell0-db-create-n2799" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.653026 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5279318-bb6d-455f-96a5-d410d0468c6b-operator-scripts\") pod \"nova-api-db-create-jjrbk\" (UID: \"b5279318-bb6d-455f-96a5-d410d0468c6b\") " pod="openstack/nova-api-db-create-jjrbk" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.662129 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-n2799"] Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.700754 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpfs2\" (UniqueName: \"kubernetes.io/projected/b5279318-bb6d-455f-96a5-d410d0468c6b-kube-api-access-hpfs2\") pod \"nova-api-db-create-jjrbk\" (UID: \"b5279318-bb6d-455f-96a5-d410d0468c6b\") " pod="openstack/nova-api-db-create-jjrbk" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.724445 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-7k6w2"] Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.725615 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7k6w2" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.729705 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7k6w2"] Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.753611 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5053f0e2-7865-4be5-9601-2c69da731509-operator-scripts\") pod \"nova-cell0-db-create-n2799\" (UID: \"5053f0e2-7865-4be5-9601-2c69da731509\") " pod="openstack/nova-cell0-db-create-n2799" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.753677 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cce0bade-306a-4aa8-bbff-a24a79d73e22-operator-scripts\") pod \"nova-api-e077-account-create-update-9ntsd\" (UID: \"cce0bade-306a-4aa8-bbff-a24a79d73e22\") " pod="openstack/nova-api-e077-account-create-update-9ntsd" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.753728 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2mnm\" (UniqueName: \"kubernetes.io/projected/cce0bade-306a-4aa8-bbff-a24a79d73e22-kube-api-access-n2mnm\") pod \"nova-api-e077-account-create-update-9ntsd\" (UID: \"cce0bade-306a-4aa8-bbff-a24a79d73e22\") " pod="openstack/nova-api-e077-account-create-update-9ntsd" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.753855 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw6dr\" (UniqueName: \"kubernetes.io/projected/5053f0e2-7865-4be5-9601-2c69da731509-kube-api-access-hw6dr\") pod \"nova-cell0-db-create-n2799\" (UID: \"5053f0e2-7865-4be5-9601-2c69da731509\") " pod="openstack/nova-cell0-db-create-n2799" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.755070 4651 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5053f0e2-7865-4be5-9601-2c69da731509-operator-scripts\") pod \"nova-cell0-db-create-n2799\" (UID: \"5053f0e2-7865-4be5-9601-2c69da731509\") " pod="openstack/nova-cell0-db-create-n2799" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.801791 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw6dr\" (UniqueName: \"kubernetes.io/projected/5053f0e2-7865-4be5-9601-2c69da731509-kube-api-access-hw6dr\") pod \"nova-cell0-db-create-n2799\" (UID: \"5053f0e2-7865-4be5-9601-2c69da731509\") " pod="openstack/nova-cell0-db-create-n2799" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.820979 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jjrbk" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.826791 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-77b2-account-create-update-v7shh"] Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.828488 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-77b2-account-create-update-v7shh" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.839137 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.840750 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-77b2-account-create-update-v7shh"] Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.871061 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cce0bade-306a-4aa8-bbff-a24a79d73e22-operator-scripts\") pod \"nova-api-e077-account-create-update-9ntsd\" (UID: \"cce0bade-306a-4aa8-bbff-a24a79d73e22\") " pod="openstack/nova-api-e077-account-create-update-9ntsd" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.871114 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2mnm\" (UniqueName: \"kubernetes.io/projected/cce0bade-306a-4aa8-bbff-a24a79d73e22-kube-api-access-n2mnm\") pod \"nova-api-e077-account-create-update-9ntsd\" (UID: \"cce0bade-306a-4aa8-bbff-a24a79d73e22\") " pod="openstack/nova-api-e077-account-create-update-9ntsd" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.871159 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e0710f-88c5-4a6a-96d5-97f4a934eeed-operator-scripts\") pod \"nova-cell1-db-create-7k6w2\" (UID: \"34e0710f-88c5-4a6a-96d5-97f4a934eeed\") " pod="openstack/nova-cell1-db-create-7k6w2" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.871220 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kdkn\" (UniqueName: \"kubernetes.io/projected/34e0710f-88c5-4a6a-96d5-97f4a934eeed-kube-api-access-6kdkn\") pod 
\"nova-cell1-db-create-7k6w2\" (UID: \"34e0710f-88c5-4a6a-96d5-97f4a934eeed\") " pod="openstack/nova-cell1-db-create-7k6w2" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.871775 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cce0bade-306a-4aa8-bbff-a24a79d73e22-operator-scripts\") pod \"nova-api-e077-account-create-update-9ntsd\" (UID: \"cce0bade-306a-4aa8-bbff-a24a79d73e22\") " pod="openstack/nova-api-e077-account-create-update-9ntsd" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.890193 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2mnm\" (UniqueName: \"kubernetes.io/projected/cce0bade-306a-4aa8-bbff-a24a79d73e22-kube-api-access-n2mnm\") pod \"nova-api-e077-account-create-update-9ntsd\" (UID: \"cce0bade-306a-4aa8-bbff-a24a79d73e22\") " pod="openstack/nova-api-e077-account-create-update-9ntsd" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.921200 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-n2799" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.972264 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-operator-scripts\") pod \"nova-cell0-77b2-account-create-update-v7shh\" (UID: \"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad\") " pod="openstack/nova-cell0-77b2-account-create-update-v7shh" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.972338 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e0710f-88c5-4a6a-96d5-97f4a934eeed-operator-scripts\") pod \"nova-cell1-db-create-7k6w2\" (UID: \"34e0710f-88c5-4a6a-96d5-97f4a934eeed\") " pod="openstack/nova-cell1-db-create-7k6w2" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.972398 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqwp4\" (UniqueName: \"kubernetes.io/projected/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-kube-api-access-cqwp4\") pod \"nova-cell0-77b2-account-create-update-v7shh\" (UID: \"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad\") " pod="openstack/nova-cell0-77b2-account-create-update-v7shh" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.972422 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kdkn\" (UniqueName: \"kubernetes.io/projected/34e0710f-88c5-4a6a-96d5-97f4a934eeed-kube-api-access-6kdkn\") pod \"nova-cell1-db-create-7k6w2\" (UID: \"34e0710f-88c5-4a6a-96d5-97f4a934eeed\") " pod="openstack/nova-cell1-db-create-7k6w2" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.975812 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e077-account-create-update-9ntsd" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.976807 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e0710f-88c5-4a6a-96d5-97f4a934eeed-operator-scripts\") pod \"nova-cell1-db-create-7k6w2\" (UID: \"34e0710f-88c5-4a6a-96d5-97f4a934eeed\") " pod="openstack/nova-cell1-db-create-7k6w2" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.990835 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-aa1b-account-create-update-zsh2r"] Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.991934 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" Nov 26 15:08:14 crc kubenswrapper[4651]: I1126 15:08:14.996246 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.013285 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-aa1b-account-create-update-zsh2r"] Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.023088 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kdkn\" (UniqueName: \"kubernetes.io/projected/34e0710f-88c5-4a6a-96d5-97f4a934eeed-kube-api-access-6kdkn\") pod \"nova-cell1-db-create-7k6w2\" (UID: \"34e0710f-88c5-4a6a-96d5-97f4a934eeed\") " pod="openstack/nova-cell1-db-create-7k6w2" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.063120 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.068485 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7k6w2" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.073890 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-operator-scripts\") pod \"nova-cell0-77b2-account-create-update-v7shh\" (UID: \"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad\") " pod="openstack/nova-cell0-77b2-account-create-update-v7shh" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.073974 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-operator-scripts\") pod \"nova-cell1-aa1b-account-create-update-zsh2r\" (UID: \"dc0af2b0-0d8a-488e-97d3-5956869cd9e9\") " pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.074022 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqwp4\" (UniqueName: \"kubernetes.io/projected/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-kube-api-access-cqwp4\") pod \"nova-cell0-77b2-account-create-update-v7shh\" (UID: \"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad\") " pod="openstack/nova-cell0-77b2-account-create-update-v7shh" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.074106 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbtm4\" (UniqueName: \"kubernetes.io/projected/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-kube-api-access-qbtm4\") pod \"nova-cell1-aa1b-account-create-update-zsh2r\" (UID: \"dc0af2b0-0d8a-488e-97d3-5956869cd9e9\") " pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.074791 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-operator-scripts\") pod \"nova-cell0-77b2-account-create-update-v7shh\" (UID: \"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad\") " pod="openstack/nova-cell0-77b2-account-create-update-v7shh" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.107257 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqwp4\" (UniqueName: \"kubernetes.io/projected/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-kube-api-access-cqwp4\") pod \"nova-cell0-77b2-account-create-update-v7shh\" (UID: \"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad\") " pod="openstack/nova-cell0-77b2-account-create-update-v7shh" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.133480 4651 generic.go:334] "Generic (PLEG): container finished" podID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerID="d7231b8eea9cc66fdc7fad09bf4894ff40cedc164f8de53c60eb7bef61f58223" exitCode=137 Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.133520 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34a40fec-099f-437f-b32a-2b81bf3b32f8","Type":"ContainerDied","Data":"d7231b8eea9cc66fdc7fad09bf4894ff40cedc164f8de53c60eb7bef61f58223"} Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.175288 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbtm4\" (UniqueName: \"kubernetes.io/projected/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-kube-api-access-qbtm4\") pod \"nova-cell1-aa1b-account-create-update-zsh2r\" (UID: \"dc0af2b0-0d8a-488e-97d3-5956869cd9e9\") " pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.175862 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-operator-scripts\") pod \"nova-cell1-aa1b-account-create-update-zsh2r\" (UID: 
\"dc0af2b0-0d8a-488e-97d3-5956869cd9e9\") " pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.177508 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-operator-scripts\") pod \"nova-cell1-aa1b-account-create-update-zsh2r\" (UID: \"dc0af2b0-0d8a-488e-97d3-5956869cd9e9\") " pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.221529 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbtm4\" (UniqueName: \"kubernetes.io/projected/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-kube-api-access-qbtm4\") pod \"nova-cell1-aa1b-account-create-update-zsh2r\" (UID: \"dc0af2b0-0d8a-488e-97d3-5956869cd9e9\") " pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.277391 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:15 crc kubenswrapper[4651]: E1126 15:08:15.277601 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:08:15 crc kubenswrapper[4651]: E1126 15:08:15.277615 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6978d54687-jsqtl: configmap "swift-ring-files" not found Nov 26 15:08:15 crc kubenswrapper[4651]: E1126 15:08:15.277671 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift podName:09fca043-ad27-4285-8894-522bc6cc68f4 nodeName:}" failed. 
No retries permitted until 2025-11-26 15:08:17.277642691 +0000 UTC m=+1064.703390295 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift") pod "swift-proxy-6978d54687-jsqtl" (UID: "09fca043-ad27-4285-8894-522bc6cc68f4") : configmap "swift-ring-files" not found Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.339691 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-77b2-account-create-update-v7shh" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.369156 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.566615 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.687614 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-log-httpd\") pod \"34a40fec-099f-437f-b32a-2b81bf3b32f8\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.687671 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-sg-core-conf-yaml\") pod \"34a40fec-099f-437f-b32a-2b81bf3b32f8\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.687747 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-combined-ca-bundle\") pod \"34a40fec-099f-437f-b32a-2b81bf3b32f8\" (UID: 
\"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.687782 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jrbh\" (UniqueName: \"kubernetes.io/projected/34a40fec-099f-437f-b32a-2b81bf3b32f8-kube-api-access-9jrbh\") pod \"34a40fec-099f-437f-b32a-2b81bf3b32f8\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.687858 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-config-data\") pod \"34a40fec-099f-437f-b32a-2b81bf3b32f8\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.687893 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-run-httpd\") pod \"34a40fec-099f-437f-b32a-2b81bf3b32f8\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.687953 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-scripts\") pod \"34a40fec-099f-437f-b32a-2b81bf3b32f8\" (UID: \"34a40fec-099f-437f-b32a-2b81bf3b32f8\") " Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.690317 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "34a40fec-099f-437f-b32a-2b81bf3b32f8" (UID: "34a40fec-099f-437f-b32a-2b81bf3b32f8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.722694 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a40fec-099f-437f-b32a-2b81bf3b32f8-kube-api-access-9jrbh" (OuterVolumeSpecName: "kube-api-access-9jrbh") pod "34a40fec-099f-437f-b32a-2b81bf3b32f8" (UID: "34a40fec-099f-437f-b32a-2b81bf3b32f8"). InnerVolumeSpecName "kube-api-access-9jrbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.760860 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-scripts" (OuterVolumeSpecName: "scripts") pod "34a40fec-099f-437f-b32a-2b81bf3b32f8" (UID: "34a40fec-099f-437f-b32a-2b81bf3b32f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.806156 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "34a40fec-099f-437f-b32a-2b81bf3b32f8" (UID: "34a40fec-099f-437f-b32a-2b81bf3b32f8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.806547 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jrbh\" (UniqueName: \"kubernetes.io/projected/34a40fec-099f-437f-b32a-2b81bf3b32f8-kube-api-access-9jrbh\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.806583 4651 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.806593 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.884178 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "34a40fec-099f-437f-b32a-2b81bf3b32f8" (UID: "34a40fec-099f-437f-b32a-2b81bf3b32f8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.907821 4651 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34a40fec-099f-437f-b32a-2b81bf3b32f8-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.907847 4651 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.934218 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-config-data" (OuterVolumeSpecName: "config-data") pod "34a40fec-099f-437f-b32a-2b81bf3b32f8" (UID: "34a40fec-099f-437f-b32a-2b81bf3b32f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.948369 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34a40fec-099f-437f-b32a-2b81bf3b32f8" (UID: "34a40fec-099f-437f-b32a-2b81bf3b32f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:15 crc kubenswrapper[4651]: I1126 15:08:15.999450 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-n2799"] Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.009987 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.010019 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a40fec-099f-437f-b32a-2b81bf3b32f8-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.025019 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e077-account-create-update-9ntsd"] Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.046834 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jjrbk"] Nov 26 15:08:16 crc kubenswrapper[4651]: W1126 15:08:16.051143 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5053f0e2_7865_4be5_9601_2c69da731509.slice/crio-338a1d362700324d24374548061cecb5723a63f9b4a4b16e0c6bd58fe22844d9 WatchSource:0}: Error finding container 338a1d362700324d24374548061cecb5723a63f9b4a4b16e0c6bd58fe22844d9: Status 404 returned error can't find the container with id 338a1d362700324d24374548061cecb5723a63f9b4a4b16e0c6bd58fe22844d9 Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.126272 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7k6w2"] Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.183715 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-aa1b-account-create-update-zsh2r"] Nov 26 15:08:16 crc 
kubenswrapper[4651]: I1126 15:08:16.195237 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jjrbk" event={"ID":"b5279318-bb6d-455f-96a5-d410d0468c6b","Type":"ContainerStarted","Data":"080769eb417f412ec9c8d130c4c4d9c9bdc3871b3a905ffa7c93c0977b9ea444"} Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.207226 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n2799" event={"ID":"5053f0e2-7865-4be5-9601-2c69da731509","Type":"ContainerStarted","Data":"338a1d362700324d24374548061cecb5723a63f9b4a4b16e0c6bd58fe22844d9"} Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.215561 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7k6w2" event={"ID":"34e0710f-88c5-4a6a-96d5-97f4a934eeed","Type":"ContainerStarted","Data":"380efef30fd2a21fd38003aceee1b45022de4626ee25ab59d308d7f4f48961d5"} Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.220455 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-77b2-account-create-update-v7shh"] Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.232572 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34a40fec-099f-437f-b32a-2b81bf3b32f8","Type":"ContainerDied","Data":"4a21970cb8ddba9e1488e569bb522183c680b2b7bc3af08cf6af25170dcda8a1"} Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.232629 4651 scope.go:117] "RemoveContainer" containerID="d7231b8eea9cc66fdc7fad09bf4894ff40cedc164f8de53c60eb7bef61f58223" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.232791 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.236920 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e077-account-create-update-9ntsd" event={"ID":"cce0bade-306a-4aa8-bbff-a24a79d73e22","Type":"ContainerStarted","Data":"93103f792082689e5414404d8470bc19e270610bae030c40ef102c98be2071b9"} Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.311096 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.326743 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.337913 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:16 crc kubenswrapper[4651]: E1126 15:08:16.338292 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="ceilometer-notification-agent" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.338306 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="ceilometer-notification-agent" Nov 26 15:08:16 crc kubenswrapper[4651]: E1126 15:08:16.338316 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="proxy-httpd" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.338323 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="proxy-httpd" Nov 26 15:08:16 crc kubenswrapper[4651]: E1126 15:08:16.338337 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="sg-core" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.338343 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" 
containerName="sg-core" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.338510 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="ceilometer-notification-agent" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.338528 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="sg-core" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.338539 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" containerName="proxy-httpd" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.340080 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.346701 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.346834 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.354494 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.521725 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.522525 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-config-data\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " 
pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.522733 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr5td\" (UniqueName: \"kubernetes.io/projected/d1a452fa-5305-40c6-a374-fb74225abd07-kube-api-access-qr5td\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.522809 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-log-httpd\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.522960 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.523029 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-scripts\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.523191 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-run-httpd\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.624657 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.624697 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-config-data\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.624749 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr5td\" (UniqueName: \"kubernetes.io/projected/d1a452fa-5305-40c6-a374-fb74225abd07-kube-api-access-qr5td\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.624772 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-log-httpd\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.624801 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.624823 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-scripts\") pod \"ceilometer-0\" (UID: 
\"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.624850 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-run-httpd\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.625329 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-log-httpd\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.625345 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-run-httpd\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.631296 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.631506 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.632392 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-config-data\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.633979 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-scripts\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.642358 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr5td\" (UniqueName: \"kubernetes.io/projected/d1a452fa-5305-40c6-a374-fb74225abd07-kube-api-access-qr5td\") pod \"ceilometer-0\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " pod="openstack/ceilometer-0" Nov 26 15:08:16 crc kubenswrapper[4651]: I1126 15:08:16.675449 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:17 crc kubenswrapper[4651]: I1126 15:08:17.337885 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:17 crc kubenswrapper[4651]: E1126 15:08:17.338056 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:08:17 crc kubenswrapper[4651]: E1126 15:08:17.338399 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6978d54687-jsqtl: configmap "swift-ring-files" not found Nov 26 15:08:17 crc kubenswrapper[4651]: E1126 15:08:17.338448 4651 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift podName:09fca043-ad27-4285-8894-522bc6cc68f4 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:21.338429801 +0000 UTC m=+1068.764177405 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift") pod "swift-proxy-6978d54687-jsqtl" (UID: "09fca043-ad27-4285-8894-522bc6cc68f4") : configmap "swift-ring-files" not found Nov 26 15:08:17 crc kubenswrapper[4651]: I1126 15:08:17.417056 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a40fec-099f-437f-b32a-2b81bf3b32f8" path="/var/lib/kubelet/pods/34a40fec-099f-437f-b32a-2b81bf3b32f8/volumes" Nov 26 15:08:21 crc kubenswrapper[4651]: I1126 15:08:21.441742 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:21 crc kubenswrapper[4651]: E1126 15:08:21.441931 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:08:21 crc kubenswrapper[4651]: E1126 15:08:21.443253 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6978d54687-jsqtl: configmap "swift-ring-files" not found Nov 26 15:08:21 crc kubenswrapper[4651]: E1126 15:08:21.443321 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift podName:09fca043-ad27-4285-8894-522bc6cc68f4 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:29.443299411 +0000 UTC m=+1076.869047025 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift") pod "swift-proxy-6978d54687-jsqtl" (UID: "09fca043-ad27-4285-8894-522bc6cc68f4") : configmap "swift-ring-files" not found Nov 26 15:08:23 crc kubenswrapper[4651]: W1126 15:08:23.762665 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc0af2b0_0d8a_488e_97d3_5956869cd9e9.slice/crio-7b1e3733f8da811eebd49805a5d60b8ae598cd6d8814f49be0aaac853f5744ed WatchSource:0}: Error finding container 7b1e3733f8da811eebd49805a5d60b8ae598cd6d8814f49be0aaac853f5744ed: Status 404 returned error can't find the container with id 7b1e3733f8da811eebd49805a5d60b8ae598cd6d8814f49be0aaac853f5744ed Nov 26 15:08:23 crc kubenswrapper[4651]: W1126 15:08:23.810414 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9edcf88_8f1d_419e_afd8_5e5861d5b5ad.slice/crio-b38349cbaa78379df8ef14e413dfece548aa998037017265020c6c1784f1da52 WatchSource:0}: Error finding container b38349cbaa78379df8ef14e413dfece548aa998037017265020c6c1784f1da52: Status 404 returned error can't find the container with id b38349cbaa78379df8ef14e413dfece548aa998037017265020c6c1784f1da52 Nov 26 15:08:23 crc kubenswrapper[4651]: I1126 15:08:23.911781 4651 scope.go:117] "RemoveContainer" containerID="3acf0890d832aef9e73def3a879e5bd074b7a482d3f77ce9eaebc8d3c9e6db46" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.256283 4651 scope.go:117] "RemoveContainer" containerID="982ad101dcca888166edea02c1d706d541f1e1b6586983041bc2e91bcdd03cc4" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.367992 4651 generic.go:334] "Generic (PLEG): container finished" podID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" containerID="088a196db0a48e953a7c9261491a46faca353ccb69047db1497ecdcca7c986f1" exitCode=137 Nov 26 15:08:24 crc 
kubenswrapper[4651]: I1126 15:08:24.368467 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cd3eea7b-b827-4648-a65f-9f8508f0f6c2","Type":"ContainerDied","Data":"088a196db0a48e953a7c9261491a46faca353ccb69047db1497ecdcca7c986f1"} Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.372226 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-77b2-account-create-update-v7shh" event={"ID":"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad","Type":"ContainerStarted","Data":"b38349cbaa78379df8ef14e413dfece548aa998037017265020c6c1784f1da52"} Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.385857 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" event={"ID":"dc0af2b0-0d8a-488e-97d3-5956869cd9e9","Type":"ContainerStarted","Data":"7b1e3733f8da811eebd49805a5d60b8ae598cd6d8814f49be0aaac853f5744ed"} Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.559272 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:24 crc kubenswrapper[4651]: W1126 15:08:24.575627 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a452fa_5305_40c6_a374_fb74225abd07.slice/crio-ff4fe0e54b44912660a9fa98dc06abfbdda5e1a71b25b0996c22853f426fe6a0 WatchSource:0}: Error finding container ff4fe0e54b44912660a9fa98dc06abfbdda5e1a71b25b0996c22853f426fe6a0: Status 404 returned error can't find the container with id ff4fe0e54b44912660a9fa98dc06abfbdda5e1a71b25b0996c22853f426fe6a0 Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.616349 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.815018 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-scripts\") pod \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.815392 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnzr2\" (UniqueName: \"kubernetes.io/projected/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-kube-api-access-bnzr2\") pod \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.815413 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-combined-ca-bundle\") pod \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.815484 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data\") pod \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.815554 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-logs\") pod \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.815591 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data-custom\") pod \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.815626 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-etc-machine-id\") pod \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\" (UID: \"cd3eea7b-b827-4648-a65f-9f8508f0f6c2\") " Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.817536 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-logs" (OuterVolumeSpecName: "logs") pod "cd3eea7b-b827-4648-a65f-9f8508f0f6c2" (UID: "cd3eea7b-b827-4648-a65f-9f8508f0f6c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.817684 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cd3eea7b-b827-4648-a65f-9f8508f0f6c2" (UID: "cd3eea7b-b827-4648-a65f-9f8508f0f6c2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.843629 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-kube-api-access-bnzr2" (OuterVolumeSpecName: "kube-api-access-bnzr2") pod "cd3eea7b-b827-4648-a65f-9f8508f0f6c2" (UID: "cd3eea7b-b827-4648-a65f-9f8508f0f6c2"). InnerVolumeSpecName "kube-api-access-bnzr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.843961 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cd3eea7b-b827-4648-a65f-9f8508f0f6c2" (UID: "cd3eea7b-b827-4648-a65f-9f8508f0f6c2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.848736 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-scripts" (OuterVolumeSpecName: "scripts") pod "cd3eea7b-b827-4648-a65f-9f8508f0f6c2" (UID: "cd3eea7b-b827-4648-a65f-9f8508f0f6c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.895801 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd3eea7b-b827-4648-a65f-9f8508f0f6c2" (UID: "cd3eea7b-b827-4648-a65f-9f8508f0f6c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.918385 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.919645 4651 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.920061 4651 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.920126 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.920204 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnzr2\" (UniqueName: \"kubernetes.io/projected/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-kube-api-access-bnzr2\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.920290 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:24 crc kubenswrapper[4651]: I1126 15:08:24.937223 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data" (OuterVolumeSpecName: "config-data") pod "cd3eea7b-b827-4648-a65f-9f8508f0f6c2" (UID: "cd3eea7b-b827-4648-a65f-9f8508f0f6c2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.022281 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd3eea7b-b827-4648-a65f-9f8508f0f6c2-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.395605 4651 generic.go:334] "Generic (PLEG): container finished" podID="5053f0e2-7865-4be5-9601-2c69da731509" containerID="1a2c81fc4664a228015b39d5625de039f2eb37b136c86f7e2a4d9d34504fa25c" exitCode=0 Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.395666 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n2799" event={"ID":"5053f0e2-7865-4be5-9601-2c69da731509","Type":"ContainerDied","Data":"1a2c81fc4664a228015b39d5625de039f2eb37b136c86f7e2a4d9d34504fa25c"} Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.398249 4651 generic.go:334] "Generic (PLEG): container finished" podID="cce0bade-306a-4aa8-bbff-a24a79d73e22" containerID="ea59fba41340cffb7387d09140bf315eea250a2fc013b18c7979fa7e3833c91c" exitCode=0 Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.398306 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e077-account-create-update-9ntsd" event={"ID":"cce0bade-306a-4aa8-bbff-a24a79d73e22","Type":"ContainerDied","Data":"ea59fba41340cffb7387d09140bf315eea250a2fc013b18c7979fa7e3833c91c"} Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.399506 4651 generic.go:334] "Generic (PLEG): container finished" podID="b5279318-bb6d-455f-96a5-d410d0468c6b" containerID="5b4f2e94eeddb1f14656a8ccf11f737869c9fe1c77ec64146502e320a265190b" exitCode=0 Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.399533 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jjrbk" 
event={"ID":"b5279318-bb6d-455f-96a5-d410d0468c6b","Type":"ContainerDied","Data":"5b4f2e94eeddb1f14656a8ccf11f737869c9fe1c77ec64146502e320a265190b"} Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.400558 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1a452fa-5305-40c6-a374-fb74225abd07","Type":"ContainerStarted","Data":"ff4fe0e54b44912660a9fa98dc06abfbdda5e1a71b25b0996c22853f426fe6a0"} Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.416378 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-77b2-account-create-update-v7shh" event={"ID":"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad","Type":"ContainerStarted","Data":"407b2ba4795ba4c8ffdb266d5b919b0403ad2cfe5961c060ec6298e1621c9aa3"} Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.416425 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7k6w2" event={"ID":"34e0710f-88c5-4a6a-96d5-97f4a934eeed","Type":"ContainerStarted","Data":"b3983f8284e08861786acdea87ae2b3035437f66338617f3ce18f76772a9aa6b"} Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.416443 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" event={"ID":"dc0af2b0-0d8a-488e-97d3-5956869cd9e9","Type":"ContainerStarted","Data":"dd620561a989ccabf95243d7d0c2da8791578a36ce2368d783ef2c81ba468fea"} Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.416457 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a13f0157-d7dd-46c4-86cf-7397655d1e83","Type":"ContainerStarted","Data":"0c6f615c3862b80143cf54e32b921386dcd89e4ffa8c968b52cb1652288e70e2"} Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.420295 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"cd3eea7b-b827-4648-a65f-9f8508f0f6c2","Type":"ContainerDied","Data":"320965a4f215576ca457054bdd1f64d1aacc7606a0c28cd76d32d5278b611a50"} Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.420362 4651 scope.go:117] "RemoveContainer" containerID="088a196db0a48e953a7c9261491a46faca353ccb69047db1497ecdcca7c986f1" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.420666 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.465054 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.519763611 podStartE2EDuration="17.465015475s" podCreationTimestamp="2025-11-26 15:08:08 +0000 UTC" firstStartedPulling="2025-11-26 15:08:09.192777581 +0000 UTC m=+1056.618525185" lastFinishedPulling="2025-11-26 15:08:24.138029445 +0000 UTC m=+1071.563777049" observedRunningTime="2025-11-26 15:08:25.457798485 +0000 UTC m=+1072.883546099" watchObservedRunningTime="2025-11-26 15:08:25.465015475 +0000 UTC m=+1072.890763079" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.480897 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" podStartSLOduration=11.480878393 podStartE2EDuration="11.480878393s" podCreationTimestamp="2025-11-26 15:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:08:25.475292229 +0000 UTC m=+1072.901039833" watchObservedRunningTime="2025-11-26 15:08:25.480878393 +0000 UTC m=+1072.906626007" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.501451 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-7k6w2" podStartSLOduration=11.50143203 podStartE2EDuration="11.50143203s" podCreationTimestamp="2025-11-26 15:08:14 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:08:25.491323541 +0000 UTC m=+1072.917071155" watchObservedRunningTime="2025-11-26 15:08:25.50143203 +0000 UTC m=+1072.927179624" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.531346 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-77b2-account-create-update-v7shh" podStartSLOduration=11.531329626 podStartE2EDuration="11.531329626s" podCreationTimestamp="2025-11-26 15:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:08:25.523996864 +0000 UTC m=+1072.949744468" watchObservedRunningTime="2025-11-26 15:08:25.531329626 +0000 UTC m=+1072.957077230" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.559980 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.569239 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.586284 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:08:25 crc kubenswrapper[4651]: E1126 15:08:25.586777 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" containerName="cinder-api-log" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.586801 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" containerName="cinder-api-log" Nov 26 15:08:25 crc kubenswrapper[4651]: E1126 15:08:25.586843 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" containerName="cinder-api" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.586851 4651 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" containerName="cinder-api" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.587093 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" containerName="cinder-api-log" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.587130 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" containerName="cinder-api" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.588356 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.590569 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.590685 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.594479 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.604495 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.736420 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f93566af-1613-4989-a326-26aa8cc4447c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.736492 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-config-data\") pod \"cinder-api-0\" (UID: 
\"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.736527 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.736568 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-config-data-custom\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.736648 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93566af-1613-4989-a326-26aa8cc4447c-logs\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.736678 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.736701 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 
15:08:25.736731 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg8j8\" (UniqueName: \"kubernetes.io/projected/f93566af-1613-4989-a326-26aa8cc4447c-kube-api-access-jg8j8\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.736804 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-scripts\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.745614 4651 scope.go:117] "RemoveContainer" containerID="072bf1f7383a4f1f50ce9b866e5b0f644df618df715f8dd2f1c15dc4c51b378e" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.838440 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-scripts\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.838523 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f93566af-1613-4989-a326-26aa8cc4447c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.838558 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-config-data\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.838582 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.838611 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-config-data-custom\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.838667 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93566af-1613-4989-a326-26aa8cc4447c-logs\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.838687 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.838701 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.838721 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg8j8\" (UniqueName: \"kubernetes.io/projected/f93566af-1613-4989-a326-26aa8cc4447c-kube-api-access-jg8j8\") pod 
\"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.840644 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f93566af-1613-4989-a326-26aa8cc4447c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.846178 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-scripts\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.850875 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93566af-1613-4989-a326-26aa8cc4447c-logs\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.853297 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.854022 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-config-data-custom\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.860221 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-config-data\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.863296 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.863828 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg8j8\" (UniqueName: \"kubernetes.io/projected/f93566af-1613-4989-a326-26aa8cc4447c-kube-api-access-jg8j8\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.864249 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f93566af-1613-4989-a326-26aa8cc4447c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f93566af-1613-4989-a326-26aa8cc4447c\") " pod="openstack/cinder-api-0" Nov 26 15:08:25 crc kubenswrapper[4651]: I1126 15:08:25.905281 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 26 15:08:26 crc kubenswrapper[4651]: I1126 15:08:26.410094 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 26 15:08:26 crc kubenswrapper[4651]: I1126 15:08:26.438537 4651 generic.go:334] "Generic (PLEG): container finished" podID="e9edcf88-8f1d-419e-afd8-5e5861d5b5ad" containerID="407b2ba4795ba4c8ffdb266d5b919b0403ad2cfe5961c060ec6298e1621c9aa3" exitCode=0 Nov 26 15:08:26 crc kubenswrapper[4651]: I1126 15:08:26.438608 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-77b2-account-create-update-v7shh" event={"ID":"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad","Type":"ContainerDied","Data":"407b2ba4795ba4c8ffdb266d5b919b0403ad2cfe5961c060ec6298e1621c9aa3"} Nov 26 15:08:26 crc kubenswrapper[4651]: I1126 15:08:26.447698 4651 generic.go:334] "Generic (PLEG): container finished" podID="34e0710f-88c5-4a6a-96d5-97f4a934eeed" containerID="b3983f8284e08861786acdea87ae2b3035437f66338617f3ce18f76772a9aa6b" exitCode=0 Nov 26 15:08:26 crc kubenswrapper[4651]: I1126 15:08:26.447757 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7k6w2" event={"ID":"34e0710f-88c5-4a6a-96d5-97f4a934eeed","Type":"ContainerDied","Data":"b3983f8284e08861786acdea87ae2b3035437f66338617f3ce18f76772a9aa6b"} Nov 26 15:08:26 crc kubenswrapper[4651]: I1126 15:08:26.463122 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f93566af-1613-4989-a326-26aa8cc4447c","Type":"ContainerStarted","Data":"8569074c06b17e62487d245c43751157e8dd66d6d7df947c20af731c6d12efeb"} Nov 26 15:08:26 crc kubenswrapper[4651]: I1126 15:08:26.468459 4651 generic.go:334] "Generic (PLEG): container finished" podID="dc0af2b0-0d8a-488e-97d3-5956869cd9e9" containerID="dd620561a989ccabf95243d7d0c2da8791578a36ce2368d783ef2c81ba468fea" exitCode=0 Nov 26 15:08:26 crc kubenswrapper[4651]: I1126 15:08:26.468722 4651 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" event={"ID":"dc0af2b0-0d8a-488e-97d3-5956869cd9e9","Type":"ContainerDied","Data":"dd620561a989ccabf95243d7d0c2da8791578a36ce2368d783ef2c81ba468fea"} Nov 26 15:08:26 crc kubenswrapper[4651]: I1126 15:08:26.973469 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jjrbk" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.011568 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.069676 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpfs2\" (UniqueName: \"kubernetes.io/projected/b5279318-bb6d-455f-96a5-d410d0468c6b-kube-api-access-hpfs2\") pod \"b5279318-bb6d-455f-96a5-d410d0468c6b\" (UID: \"b5279318-bb6d-455f-96a5-d410d0468c6b\") " Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.070644 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5279318-bb6d-455f-96a5-d410d0468c6b-operator-scripts\") pod \"b5279318-bb6d-455f-96a5-d410d0468c6b\" (UID: \"b5279318-bb6d-455f-96a5-d410d0468c6b\") " Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.071770 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5279318-bb6d-455f-96a5-d410d0468c6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5279318-bb6d-455f-96a5-d410d0468c6b" (UID: "b5279318-bb6d-455f-96a5-d410d0468c6b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.101635 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5279318-bb6d-455f-96a5-d410d0468c6b-kube-api-access-hpfs2" (OuterVolumeSpecName: "kube-api-access-hpfs2") pod "b5279318-bb6d-455f-96a5-d410d0468c6b" (UID: "b5279318-bb6d-455f-96a5-d410d0468c6b"). InnerVolumeSpecName "kube-api-access-hpfs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.173271 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpfs2\" (UniqueName: \"kubernetes.io/projected/b5279318-bb6d-455f-96a5-d410d0468c6b-kube-api-access-hpfs2\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.173301 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5279318-bb6d-455f-96a5-d410d0468c6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.187909 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n2799" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.188619 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e077-account-create-update-9ntsd" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.274174 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cce0bade-306a-4aa8-bbff-a24a79d73e22-operator-scripts\") pod \"cce0bade-306a-4aa8-bbff-a24a79d73e22\" (UID: \"cce0bade-306a-4aa8-bbff-a24a79d73e22\") " Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.274336 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5053f0e2-7865-4be5-9601-2c69da731509-operator-scripts\") pod \"5053f0e2-7865-4be5-9601-2c69da731509\" (UID: \"5053f0e2-7865-4be5-9601-2c69da731509\") " Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.274361 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2mnm\" (UniqueName: \"kubernetes.io/projected/cce0bade-306a-4aa8-bbff-a24a79d73e22-kube-api-access-n2mnm\") pod \"cce0bade-306a-4aa8-bbff-a24a79d73e22\" (UID: \"cce0bade-306a-4aa8-bbff-a24a79d73e22\") " Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.274417 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw6dr\" (UniqueName: \"kubernetes.io/projected/5053f0e2-7865-4be5-9601-2c69da731509-kube-api-access-hw6dr\") pod \"5053f0e2-7865-4be5-9601-2c69da731509\" (UID: \"5053f0e2-7865-4be5-9601-2c69da731509\") " Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.275453 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5053f0e2-7865-4be5-9601-2c69da731509-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5053f0e2-7865-4be5-9601-2c69da731509" (UID: "5053f0e2-7865-4be5-9601-2c69da731509"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.275685 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce0bade-306a-4aa8-bbff-a24a79d73e22-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cce0bade-306a-4aa8-bbff-a24a79d73e22" (UID: "cce0bade-306a-4aa8-bbff-a24a79d73e22"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.302393 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce0bade-306a-4aa8-bbff-a24a79d73e22-kube-api-access-n2mnm" (OuterVolumeSpecName: "kube-api-access-n2mnm") pod "cce0bade-306a-4aa8-bbff-a24a79d73e22" (UID: "cce0bade-306a-4aa8-bbff-a24a79d73e22"). InnerVolumeSpecName "kube-api-access-n2mnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.302630 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5053f0e2-7865-4be5-9601-2c69da731509-kube-api-access-hw6dr" (OuterVolumeSpecName: "kube-api-access-hw6dr") pod "5053f0e2-7865-4be5-9601-2c69da731509" (UID: "5053f0e2-7865-4be5-9601-2c69da731509"). InnerVolumeSpecName "kube-api-access-hw6dr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.376424 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5053f0e2-7865-4be5-9601-2c69da731509-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.376462 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2mnm\" (UniqueName: \"kubernetes.io/projected/cce0bade-306a-4aa8-bbff-a24a79d73e22-kube-api-access-n2mnm\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.376478 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw6dr\" (UniqueName: \"kubernetes.io/projected/5053f0e2-7865-4be5-9601-2c69da731509-kube-api-access-hw6dr\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.376490 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cce0bade-306a-4aa8-bbff-a24a79d73e22-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.423836 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd3eea7b-b827-4648-a65f-9f8508f0f6c2" path="/var/lib/kubelet/pods/cd3eea7b-b827-4648-a65f-9f8508f0f6c2/volumes" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.483593 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jjrbk" event={"ID":"b5279318-bb6d-455f-96a5-d410d0468c6b","Type":"ContainerDied","Data":"080769eb417f412ec9c8d130c4c4d9c9bdc3871b3a905ffa7c93c0977b9ea444"} Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.483636 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="080769eb417f412ec9c8d130c4c4d9c9bdc3871b3a905ffa7c93c0977b9ea444" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 
15:08:27.483706 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jjrbk" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.491269 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1a452fa-5305-40c6-a374-fb74225abd07","Type":"ContainerStarted","Data":"ee4ed6061ae9adf8b4897d19bde876823c213e57da3abb6f07a6ce1843d09122"} Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.500936 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n2799" event={"ID":"5053f0e2-7865-4be5-9601-2c69da731509","Type":"ContainerDied","Data":"338a1d362700324d24374548061cecb5723a63f9b4a4b16e0c6bd58fe22844d9"} Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.500975 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="338a1d362700324d24374548061cecb5723a63f9b4a4b16e0c6bd58fe22844d9" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.501063 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n2799" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.504936 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e077-account-create-update-9ntsd" event={"ID":"cce0bade-306a-4aa8-bbff-a24a79d73e22","Type":"ContainerDied","Data":"93103f792082689e5414404d8470bc19e270610bae030c40ef102c98be2071b9"} Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.504990 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93103f792082689e5414404d8470bc19e270610bae030c40ef102c98be2071b9" Nov 26 15:08:27 crc kubenswrapper[4651]: I1126 15:08:27.505166 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e077-account-create-update-9ntsd" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.080606 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-77b2-account-create-update-v7shh" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.149683 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7k6w2" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.157525 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.211529 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqwp4\" (UniqueName: \"kubernetes.io/projected/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-kube-api-access-cqwp4\") pod \"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad\" (UID: \"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad\") " Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.211652 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-operator-scripts\") pod \"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad\" (UID: \"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad\") " Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.214237 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9edcf88-8f1d-419e-afd8-5e5861d5b5ad" (UID: "e9edcf88-8f1d-419e-afd8-5e5861d5b5ad"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.225627 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-kube-api-access-cqwp4" (OuterVolumeSpecName: "kube-api-access-cqwp4") pod "e9edcf88-8f1d-419e-afd8-5e5861d5b5ad" (UID: "e9edcf88-8f1d-419e-afd8-5e5861d5b5ad"). InnerVolumeSpecName "kube-api-access-cqwp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.312760 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e0710f-88c5-4a6a-96d5-97f4a934eeed-operator-scripts\") pod \"34e0710f-88c5-4a6a-96d5-97f4a934eeed\" (UID: \"34e0710f-88c5-4a6a-96d5-97f4a934eeed\") " Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.313012 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbtm4\" (UniqueName: \"kubernetes.io/projected/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-kube-api-access-qbtm4\") pod \"dc0af2b0-0d8a-488e-97d3-5956869cd9e9\" (UID: \"dc0af2b0-0d8a-488e-97d3-5956869cd9e9\") " Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.314796 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-operator-scripts\") pod \"dc0af2b0-0d8a-488e-97d3-5956869cd9e9\" (UID: \"dc0af2b0-0d8a-488e-97d3-5956869cd9e9\") " Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.314833 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kdkn\" (UniqueName: \"kubernetes.io/projected/34e0710f-88c5-4a6a-96d5-97f4a934eeed-kube-api-access-6kdkn\") pod \"34e0710f-88c5-4a6a-96d5-97f4a934eeed\" (UID: \"34e0710f-88c5-4a6a-96d5-97f4a934eeed\") " Nov 26 15:08:28 crc 
kubenswrapper[4651]: I1126 15:08:28.315574 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqwp4\" (UniqueName: \"kubernetes.io/projected/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-kube-api-access-cqwp4\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.315570 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e0710f-88c5-4a6a-96d5-97f4a934eeed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34e0710f-88c5-4a6a-96d5-97f4a934eeed" (UID: "34e0710f-88c5-4a6a-96d5-97f4a934eeed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.315586 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.316438 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc0af2b0-0d8a-488e-97d3-5956869cd9e9" (UID: "dc0af2b0-0d8a-488e-97d3-5956869cd9e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.316975 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-kube-api-access-qbtm4" (OuterVolumeSpecName: "kube-api-access-qbtm4") pod "dc0af2b0-0d8a-488e-97d3-5956869cd9e9" (UID: "dc0af2b0-0d8a-488e-97d3-5956869cd9e9"). InnerVolumeSpecName "kube-api-access-qbtm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.320879 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e0710f-88c5-4a6a-96d5-97f4a934eeed-kube-api-access-6kdkn" (OuterVolumeSpecName: "kube-api-access-6kdkn") pod "34e0710f-88c5-4a6a-96d5-97f4a934eeed" (UID: "34e0710f-88c5-4a6a-96d5-97f4a934eeed"). InnerVolumeSpecName "kube-api-access-6kdkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.422876 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbtm4\" (UniqueName: \"kubernetes.io/projected/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-kube-api-access-qbtm4\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.423092 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc0af2b0-0d8a-488e-97d3-5956869cd9e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.423152 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kdkn\" (UniqueName: \"kubernetes.io/projected/34e0710f-88c5-4a6a-96d5-97f4a934eeed-kube-api-access-6kdkn\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.423206 4651 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e0710f-88c5-4a6a-96d5-97f4a934eeed-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.518914 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" event={"ID":"dc0af2b0-0d8a-488e-97d3-5956869cd9e9","Type":"ContainerDied","Data":"7b1e3733f8da811eebd49805a5d60b8ae598cd6d8814f49be0aaac853f5744ed"} Nov 26 15:08:28 crc kubenswrapper[4651]: 
I1126 15:08:28.518947 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b1e3733f8da811eebd49805a5d60b8ae598cd6d8814f49be0aaac853f5744ed" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.519508 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-aa1b-account-create-update-zsh2r" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.535152 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1a452fa-5305-40c6-a374-fb74225abd07","Type":"ContainerStarted","Data":"9a26312e243cf48b45c5589ee4e25683eae9ed1c22850ec37ecab8b4a32db5ec"} Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.548658 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-77b2-account-create-update-v7shh" event={"ID":"e9edcf88-8f1d-419e-afd8-5e5861d5b5ad","Type":"ContainerDied","Data":"b38349cbaa78379df8ef14e413dfece548aa998037017265020c6c1784f1da52"} Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.548704 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b38349cbaa78379df8ef14e413dfece548aa998037017265020c6c1784f1da52" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.548767 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-77b2-account-create-update-v7shh" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.560997 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7k6w2" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.561325 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7k6w2" event={"ID":"34e0710f-88c5-4a6a-96d5-97f4a934eeed","Type":"ContainerDied","Data":"380efef30fd2a21fd38003aceee1b45022de4626ee25ab59d308d7f4f48961d5"} Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.561353 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="380efef30fd2a21fd38003aceee1b45022de4626ee25ab59d308d7f4f48961d5" Nov 26 15:08:28 crc kubenswrapper[4651]: I1126 15:08:28.568984 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f93566af-1613-4989-a326-26aa8cc4447c","Type":"ContainerStarted","Data":"e67e8c600127f62333eb638306a3d285a9e31c6f05bb4395ac2ad62e3ed66b59"} Nov 26 15:08:29 crc kubenswrapper[4651]: I1126 15:08:29.133094 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:08:29 crc kubenswrapper[4651]: I1126 15:08:29.133399 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:08:29 crc kubenswrapper[4651]: I1126 15:08:29.453921 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " 
pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:29 crc kubenswrapper[4651]: E1126 15:08:29.454094 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:08:29 crc kubenswrapper[4651]: E1126 15:08:29.454107 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6978d54687-jsqtl: configmap "swift-ring-files" not found Nov 26 15:08:29 crc kubenswrapper[4651]: E1126 15:08:29.454151 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift podName:09fca043-ad27-4285-8894-522bc6cc68f4 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:45.454136278 +0000 UTC m=+1092.879883882 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift") pod "swift-proxy-6978d54687-jsqtl" (UID: "09fca043-ad27-4285-8894-522bc6cc68f4") : configmap "swift-ring-files" not found Nov 26 15:08:29 crc kubenswrapper[4651]: I1126 15:08:29.591684 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1a452fa-5305-40c6-a374-fb74225abd07","Type":"ContainerStarted","Data":"b6695d3f447e669cdeca9118d84db68938a0ff9211df100dc76cf311d16e816f"} Nov 26 15:08:29 crc kubenswrapper[4651]: I1126 15:08:29.593841 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f93566af-1613-4989-a326-26aa8cc4447c","Type":"ContainerStarted","Data":"6b85c74c531dfd82d41d68c9ec61747518f31552d836c7dff61437b4e867a809"} Nov 26 15:08:29 crc kubenswrapper[4651]: I1126 15:08:29.595130 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 26 15:08:29 crc kubenswrapper[4651]: I1126 15:08:29.621504 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-api-0" podStartSLOduration=4.621484179 podStartE2EDuration="4.621484179s" podCreationTimestamp="2025-11-26 15:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:08:29.615141913 +0000 UTC m=+1077.040889517" watchObservedRunningTime="2025-11-26 15:08:29.621484179 +0000 UTC m=+1077.047231783" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.325703 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-shlzt"] Nov 26 15:08:30 crc kubenswrapper[4651]: E1126 15:08:30.326699 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e0710f-88c5-4a6a-96d5-97f4a934eeed" containerName="mariadb-database-create" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.326716 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e0710f-88c5-4a6a-96d5-97f4a934eeed" containerName="mariadb-database-create" Nov 26 15:08:30 crc kubenswrapper[4651]: E1126 15:08:30.326753 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9edcf88-8f1d-419e-afd8-5e5861d5b5ad" containerName="mariadb-account-create-update" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.326763 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9edcf88-8f1d-419e-afd8-5e5861d5b5ad" containerName="mariadb-account-create-update" Nov 26 15:08:30 crc kubenswrapper[4651]: E1126 15:08:30.326774 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0af2b0-0d8a-488e-97d3-5956869cd9e9" containerName="mariadb-account-create-update" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.326782 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0af2b0-0d8a-488e-97d3-5956869cd9e9" containerName="mariadb-account-create-update" Nov 26 15:08:30 crc kubenswrapper[4651]: E1126 15:08:30.326799 4651 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5053f0e2-7865-4be5-9601-2c69da731509" containerName="mariadb-database-create" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.326808 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5053f0e2-7865-4be5-9601-2c69da731509" containerName="mariadb-database-create" Nov 26 15:08:30 crc kubenswrapper[4651]: E1126 15:08:30.326817 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce0bade-306a-4aa8-bbff-a24a79d73e22" containerName="mariadb-account-create-update" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.326824 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce0bade-306a-4aa8-bbff-a24a79d73e22" containerName="mariadb-account-create-update" Nov 26 15:08:30 crc kubenswrapper[4651]: E1126 15:08:30.326845 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5279318-bb6d-455f-96a5-d410d0468c6b" containerName="mariadb-database-create" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.326853 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5279318-bb6d-455f-96a5-d410d0468c6b" containerName="mariadb-database-create" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.327109 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e0710f-88c5-4a6a-96d5-97f4a934eeed" containerName="mariadb-database-create" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.327130 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9edcf88-8f1d-419e-afd8-5e5861d5b5ad" containerName="mariadb-account-create-update" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.327144 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="5053f0e2-7865-4be5-9601-2c69da731509" containerName="mariadb-database-create" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.327162 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce0bade-306a-4aa8-bbff-a24a79d73e22" containerName="mariadb-account-create-update" Nov 26 15:08:30 crc 
kubenswrapper[4651]: I1126 15:08:30.327176 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc0af2b0-0d8a-488e-97d3-5956869cd9e9" containerName="mariadb-account-create-update" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.327187 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5279318-bb6d-455f-96a5-d410d0468c6b" containerName="mariadb-database-create" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.327924 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.330107 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.330350 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.334072 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wv4vg" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.358238 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-shlzt"] Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.375703 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbx8z\" (UniqueName: \"kubernetes.io/projected/20922a1a-1763-45a9-911a-161e1fc4bd1e-kube-api-access-nbx8z\") pod \"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.375831 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-config-data\") pod 
\"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.375884 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-scripts\") pod \"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.375903 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.477067 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbx8z\" (UniqueName: \"kubernetes.io/projected/20922a1a-1763-45a9-911a-161e1fc4bd1e-kube-api-access-nbx8z\") pod \"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.477167 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-config-data\") pod \"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.477197 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-scripts\") pod \"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.477214 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.484742 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-scripts\") pod \"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.484919 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-config-data\") pod \"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.487692 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.498228 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbx8z\" (UniqueName: 
\"kubernetes.io/projected/20922a1a-1763-45a9-911a-161e1fc4bd1e-kube-api-access-nbx8z\") pod \"nova-cell0-conductor-db-sync-shlzt\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.603739 4651 generic.go:334] "Generic (PLEG): container finished" podID="5c09de21-84b0-440d-b34c-3054ec6741fc" containerID="56761142c110a594c6d6a7518e9e4944e0f87669709325bcff97b8c278e4b419" exitCode=137 Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.603802 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f54c7c77d-rx8gm" event={"ID":"5c09de21-84b0-440d-b34c-3054ec6741fc","Type":"ContainerDied","Data":"56761142c110a594c6d6a7518e9e4944e0f87669709325bcff97b8c278e4b419"} Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.603838 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f54c7c77d-rx8gm" event={"ID":"5c09de21-84b0-440d-b34c-3054ec6741fc","Type":"ContainerStarted","Data":"b24d36253d1184088df8f38e2aa41ad3371af1bbbe82d56ef4835ace475fee82"} Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.605978 4651 generic.go:334] "Generic (PLEG): container finished" podID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerID="bc932f0bacd9c20ebf1824e9687b4f2688afd1574336c4d92ff1fad88d1f5394" exitCode=137 Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.606017 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6974b49b94-vzn8h" event={"ID":"97c5789f-f8f7-4780-8c73-e34bc5bb4f56","Type":"ContainerDied","Data":"bc932f0bacd9c20ebf1824e9687b4f2688afd1574336c4d92ff1fad88d1f5394"} Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.606046 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6974b49b94-vzn8h" event={"ID":"97c5789f-f8f7-4780-8c73-e34bc5bb4f56","Type":"ContainerStarted","Data":"a9e18539050248184621d537c94cd7c6c67bed8a523b93401626fecf6ae227ef"} Nov 26 15:08:30 
crc kubenswrapper[4651]: I1126 15:08:30.609542 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="ceilometer-central-agent" containerID="cri-o://ee4ed6061ae9adf8b4897d19bde876823c213e57da3abb6f07a6ce1843d09122" gracePeriod=30 Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.609718 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1a452fa-5305-40c6-a374-fb74225abd07","Type":"ContainerStarted","Data":"054ebcb251000571f4801bf14ee8128d463b5bcbc60222d638b1951f7f55a25e"} Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.609754 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.609789 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="proxy-httpd" containerID="cri-o://054ebcb251000571f4801bf14ee8128d463b5bcbc60222d638b1951f7f55a25e" gracePeriod=30 Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.609829 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="sg-core" containerID="cri-o://b6695d3f447e669cdeca9118d84db68938a0ff9211df100dc76cf311d16e816f" gracePeriod=30 Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.609862 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="ceilometer-notification-agent" containerID="cri-o://9a26312e243cf48b45c5589ee4e25683eae9ed1c22850ec37ecab8b4a32db5ec" gracePeriod=30 Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.673482 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:08:30 crc kubenswrapper[4651]: I1126 15:08:30.683156 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=9.328162887 podStartE2EDuration="14.683137593s" podCreationTimestamp="2025-11-26 15:08:16 +0000 UTC" firstStartedPulling="2025-11-26 15:08:24.594136419 +0000 UTC m=+1072.019884023" lastFinishedPulling="2025-11-26 15:08:29.949111125 +0000 UTC m=+1077.374858729" observedRunningTime="2025-11-26 15:08:30.673840935 +0000 UTC m=+1078.099588549" watchObservedRunningTime="2025-11-26 15:08:30.683137593 +0000 UTC m=+1078.108885197" Nov 26 15:08:31 crc kubenswrapper[4651]: I1126 15:08:31.021971 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-shlzt"] Nov 26 15:08:31 crc kubenswrapper[4651]: E1126 15:08:31.232366 4651 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a452fa_5305_40c6_a374_fb74225abd07.slice/crio-9a26312e243cf48b45c5589ee4e25683eae9ed1c22850ec37ecab8b4a32db5ec.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a452fa_5305_40c6_a374_fb74225abd07.slice/crio-conmon-9a26312e243cf48b45c5589ee4e25683eae9ed1c22850ec37ecab8b4a32db5ec.scope\": RecentStats: unable to find data in memory cache]" Nov 26 15:08:31 crc kubenswrapper[4651]: I1126 15:08:31.620943 4651 generic.go:334] "Generic (PLEG): container finished" podID="d1a452fa-5305-40c6-a374-fb74225abd07" containerID="054ebcb251000571f4801bf14ee8128d463b5bcbc60222d638b1951f7f55a25e" exitCode=0 Nov 26 15:08:31 crc kubenswrapper[4651]: I1126 15:08:31.620979 4651 generic.go:334] "Generic (PLEG): container finished" podID="d1a452fa-5305-40c6-a374-fb74225abd07" 
containerID="b6695d3f447e669cdeca9118d84db68938a0ff9211df100dc76cf311d16e816f" exitCode=2 Nov 26 15:08:31 crc kubenswrapper[4651]: I1126 15:08:31.620990 4651 generic.go:334] "Generic (PLEG): container finished" podID="d1a452fa-5305-40c6-a374-fb74225abd07" containerID="9a26312e243cf48b45c5589ee4e25683eae9ed1c22850ec37ecab8b4a32db5ec" exitCode=0 Nov 26 15:08:31 crc kubenswrapper[4651]: I1126 15:08:31.620984 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1a452fa-5305-40c6-a374-fb74225abd07","Type":"ContainerDied","Data":"054ebcb251000571f4801bf14ee8128d463b5bcbc60222d638b1951f7f55a25e"} Nov 26 15:08:31 crc kubenswrapper[4651]: I1126 15:08:31.621105 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1a452fa-5305-40c6-a374-fb74225abd07","Type":"ContainerDied","Data":"b6695d3f447e669cdeca9118d84db68938a0ff9211df100dc76cf311d16e816f"} Nov 26 15:08:31 crc kubenswrapper[4651]: I1126 15:08:31.621122 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1a452fa-5305-40c6-a374-fb74225abd07","Type":"ContainerDied","Data":"9a26312e243cf48b45c5589ee4e25683eae9ed1c22850ec37ecab8b4a32db5ec"} Nov 26 15:08:31 crc kubenswrapper[4651]: I1126 15:08:31.624253 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-shlzt" event={"ID":"20922a1a-1763-45a9-911a-161e1fc4bd1e","Type":"ContainerStarted","Data":"f63c3fc37e41e8139ebb827afe061d9b57077c047447afb4d6ccb84e9f24baad"} Nov 26 15:08:35 crc kubenswrapper[4651]: I1126 15:08:35.670425 4651 generic.go:334] "Generic (PLEG): container finished" podID="d1a452fa-5305-40c6-a374-fb74225abd07" containerID="ee4ed6061ae9adf8b4897d19bde876823c213e57da3abb6f07a6ce1843d09122" exitCode=0 Nov 26 15:08:35 crc kubenswrapper[4651]: I1126 15:08:35.670639 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d1a452fa-5305-40c6-a374-fb74225abd07","Type":"ContainerDied","Data":"ee4ed6061ae9adf8b4897d19bde876823c213e57da3abb6f07a6ce1843d09122"} Nov 26 15:08:39 crc kubenswrapper[4651]: I1126 15:08:39.272291 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 26 15:08:39 crc kubenswrapper[4651]: I1126 15:08:39.612552 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:08:39 crc kubenswrapper[4651]: I1126 15:08:39.612617 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:08:39 crc kubenswrapper[4651]: I1126 15:08:39.781294 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:08:39 crc kubenswrapper[4651]: I1126 15:08:39.781637 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:08:43 crc kubenswrapper[4651]: I1126 15:08:43.918295 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="f93566af-1613-4989-a326-26aa8cc4447c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.175:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.217195 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.384894 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr5td\" (UniqueName: \"kubernetes.io/projected/d1a452fa-5305-40c6-a374-fb74225abd07-kube-api-access-qr5td\") pod \"d1a452fa-5305-40c6-a374-fb74225abd07\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.385049 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-run-httpd\") pod \"d1a452fa-5305-40c6-a374-fb74225abd07\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.385092 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-log-httpd\") pod \"d1a452fa-5305-40c6-a374-fb74225abd07\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.385160 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-scripts\") pod \"d1a452fa-5305-40c6-a374-fb74225abd07\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.385192 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-sg-core-conf-yaml\") pod \"d1a452fa-5305-40c6-a374-fb74225abd07\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.385227 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-combined-ca-bundle\") pod \"d1a452fa-5305-40c6-a374-fb74225abd07\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.385271 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-config-data\") pod \"d1a452fa-5305-40c6-a374-fb74225abd07\" (UID: \"d1a452fa-5305-40c6-a374-fb74225abd07\") " Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.385572 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d1a452fa-5305-40c6-a374-fb74225abd07" (UID: "d1a452fa-5305-40c6-a374-fb74225abd07"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.385621 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d1a452fa-5305-40c6-a374-fb74225abd07" (UID: "d1a452fa-5305-40c6-a374-fb74225abd07"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.391465 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-scripts" (OuterVolumeSpecName: "scripts") pod "d1a452fa-5305-40c6-a374-fb74225abd07" (UID: "d1a452fa-5305-40c6-a374-fb74225abd07"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.391518 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a452fa-5305-40c6-a374-fb74225abd07-kube-api-access-qr5td" (OuterVolumeSpecName: "kube-api-access-qr5td") pod "d1a452fa-5305-40c6-a374-fb74225abd07" (UID: "d1a452fa-5305-40c6-a374-fb74225abd07"). InnerVolumeSpecName "kube-api-access-qr5td". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.412771 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d1a452fa-5305-40c6-a374-fb74225abd07" (UID: "d1a452fa-5305-40c6-a374-fb74225abd07"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.480738 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1a452fa-5305-40c6-a374-fb74225abd07" (UID: "d1a452fa-5305-40c6-a374-fb74225abd07"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.488163 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.488346 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr5td\" (UniqueName: \"kubernetes.io/projected/d1a452fa-5305-40c6-a374-fb74225abd07-kube-api-access-qr5td\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.488359 4651 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.488368 4651 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1a452fa-5305-40c6-a374-fb74225abd07-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.488377 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.488385 4651 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.488394 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Nov 26 15:08:45 crc kubenswrapper[4651]: E1126 15:08:45.489281 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:08:45 crc kubenswrapper[4651]: E1126 15:08:45.489300 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6978d54687-jsqtl: configmap "swift-ring-files" not found Nov 26 15:08:45 crc kubenswrapper[4651]: E1126 15:08:45.489354 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift podName:09fca043-ad27-4285-8894-522bc6cc68f4 nodeName:}" failed. No retries permitted until 2025-11-26 15:09:17.489340427 +0000 UTC m=+1124.915088031 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift") pod "swift-proxy-6978d54687-jsqtl" (UID: "09fca043-ad27-4285-8894-522bc6cc68f4") : configmap "swift-ring-files" not found Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.515603 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-config-data" (OuterVolumeSpecName: "config-data") pod "d1a452fa-5305-40c6-a374-fb74225abd07" (UID: "d1a452fa-5305-40c6-a374-fb74225abd07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.590193 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a452fa-5305-40c6-a374-fb74225abd07-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.759504 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1a452fa-5305-40c6-a374-fb74225abd07","Type":"ContainerDied","Data":"ff4fe0e54b44912660a9fa98dc06abfbdda5e1a71b25b0996c22853f426fe6a0"} Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.759783 4651 scope.go:117] "RemoveContainer" containerID="054ebcb251000571f4801bf14ee8128d463b5bcbc60222d638b1951f7f55a25e" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.759900 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.780679 4651 scope.go:117] "RemoveContainer" containerID="b6695d3f447e669cdeca9118d84db68938a0ff9211df100dc76cf311d16e816f" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.796215 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.835869 4651 scope.go:117] "RemoveContainer" containerID="9a26312e243cf48b45c5589ee4e25683eae9ed1c22850ec37ecab8b4a32db5ec" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.853952 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.868855 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:45 crc kubenswrapper[4651]: E1126 15:08:45.869552 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="proxy-httpd" Nov 26 15:08:45 crc 
kubenswrapper[4651]: I1126 15:08:45.869571 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="proxy-httpd" Nov 26 15:08:45 crc kubenswrapper[4651]: E1126 15:08:45.869595 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="ceilometer-central-agent" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.869602 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="ceilometer-central-agent" Nov 26 15:08:45 crc kubenswrapper[4651]: E1126 15:08:45.869626 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="ceilometer-notification-agent" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.869633 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="ceilometer-notification-agent" Nov 26 15:08:45 crc kubenswrapper[4651]: E1126 15:08:45.869657 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="sg-core" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.869665 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="sg-core" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.870013 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="sg-core" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.870105 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="ceilometer-central-agent" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.870122 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" 
containerName="ceilometer-notification-agent" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.870135 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" containerName="proxy-httpd" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.872873 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.874934 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.875278 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.881352 4651 scope.go:117] "RemoveContainer" containerID="ee4ed6061ae9adf8b4897d19bde876823c213e57da3abb6f07a6ce1843d09122" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.882296 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.998209 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-config-data\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.998242 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.998356 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.998390 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-log-httpd\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.998412 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rlsj\" (UniqueName: \"kubernetes.io/projected/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-kube-api-access-6rlsj\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.998509 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-run-httpd\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:45 crc kubenswrapper[4651]: I1126 15:08:45.998556 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-scripts\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.100889 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-run-httpd\") pod \"ceilometer-0\" (UID: 
\"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.100992 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-scripts\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.101136 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-config-data\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.101169 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.101312 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.101355 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-log-httpd\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.101408 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rlsj\" 
(UniqueName: \"kubernetes.io/projected/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-kube-api-access-6rlsj\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.101552 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-run-httpd\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.102124 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-log-httpd\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.105850 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.108416 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.112457 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-config-data\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.124003 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rlsj\" (UniqueName: \"kubernetes.io/projected/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-kube-api-access-6rlsj\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.124674 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-scripts\") pod \"ceilometer-0\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.213948 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:46 crc kubenswrapper[4651]: W1126 15:08:46.719457 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9264d3e_322e_4b9b_9f37_fac3f5e0bf25.slice/crio-4084855609fb1c00446443830873c4dc36513a90fa0e102420104a4f356cc0e0 WatchSource:0}: Error finding container 4084855609fb1c00446443830873c4dc36513a90fa0e102420104a4f356cc0e0: Status 404 returned error can't find the container with id 4084855609fb1c00446443830873c4dc36513a90fa0e102420104a4f356cc0e0 Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.728391 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:46 crc kubenswrapper[4651]: I1126 15:08:46.775763 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25","Type":"ContainerStarted","Data":"4084855609fb1c00446443830873c4dc36513a90fa0e102420104a4f356cc0e0"} Nov 26 15:08:46 crc kubenswrapper[4651]: E1126 15:08:46.785728 4651 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified" Nov 26 15:08:46 crc kubenswrapper[4651]: E1126 15:08:46.785898 4651 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nbx8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile
:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-shlzt_openstack(20922a1a-1763-45a9-911a-161e1fc4bd1e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 26 15:08:46 crc kubenswrapper[4651]: E1126 15:08:46.787418 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-shlzt" podUID="20922a1a-1763-45a9-911a-161e1fc4bd1e" Nov 26 15:08:47 crc kubenswrapper[4651]: I1126 15:08:47.422262 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a452fa-5305-40c6-a374-fb74225abd07" path="/var/lib/kubelet/pods/d1a452fa-5305-40c6-a374-fb74225abd07/volumes" Nov 26 15:08:47 crc kubenswrapper[4651]: E1126 15:08:47.794613 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-shlzt" podUID="20922a1a-1763-45a9-911a-161e1fc4bd1e" Nov 26 15:08:48 crc kubenswrapper[4651]: I1126 15:08:48.803916 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25","Type":"ContainerStarted","Data":"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa"} Nov 26 15:08:48 crc kubenswrapper[4651]: I1126 15:08:48.805219 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25","Type":"ContainerStarted","Data":"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1"} Nov 26 15:08:49 crc kubenswrapper[4651]: I1126 15:08:49.614462 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 26 15:08:49 crc kubenswrapper[4651]: I1126 15:08:49.782765 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f54c7c77d-rx8gm" podUID="5c09de21-84b0-440d-b34c-3054ec6741fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Nov 26 15:08:49 crc kubenswrapper[4651]: I1126 15:08:49.813842 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25","Type":"ContainerStarted","Data":"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10"} Nov 26 15:08:51 crc kubenswrapper[4651]: I1126 15:08:51.833949 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25","Type":"ContainerStarted","Data":"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0"} Nov 26 15:08:51 crc kubenswrapper[4651]: I1126 15:08:51.834628 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 15:08:51 crc kubenswrapper[4651]: I1126 15:08:51.856865 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.158508154 podStartE2EDuration="6.85684256s" podCreationTimestamp="2025-11-26 15:08:45 +0000 UTC" 
firstStartedPulling="2025-11-26 15:08:46.722241418 +0000 UTC m=+1094.147989022" lastFinishedPulling="2025-11-26 15:08:51.420575824 +0000 UTC m=+1098.846323428" observedRunningTime="2025-11-26 15:08:51.852454178 +0000 UTC m=+1099.278201782" watchObservedRunningTime="2025-11-26 15:08:51.85684256 +0000 UTC m=+1099.282590164" Nov 26 15:08:55 crc kubenswrapper[4651]: I1126 15:08:55.678574 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:08:55 crc kubenswrapper[4651]: I1126 15:08:55.679269 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7194a748-fcec-46b2-b6b7-a3af88cd8e14" containerName="glance-log" containerID="cri-o://cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b" gracePeriod=30 Nov 26 15:08:55 crc kubenswrapper[4651]: I1126 15:08:55.679363 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7194a748-fcec-46b2-b6b7-a3af88cd8e14" containerName="glance-httpd" containerID="cri-o://1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88" gracePeriod=30 Nov 26 15:08:55 crc kubenswrapper[4651]: I1126 15:08:55.873120 4651 generic.go:334] "Generic (PLEG): container finished" podID="7194a748-fcec-46b2-b6b7-a3af88cd8e14" containerID="cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b" exitCode=143 Nov 26 15:08:55 crc kubenswrapper[4651]: I1126 15:08:55.873397 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7194a748-fcec-46b2-b6b7-a3af88cd8e14","Type":"ContainerDied","Data":"cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b"} Nov 26 15:08:57 crc kubenswrapper[4651]: I1126 15:08:57.097966 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:08:57 crc kubenswrapper[4651]: I1126 
15:08:57.098746 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" containerName="glance-httpd" containerID="cri-o://9ec0d32784ed5ced6da55fc86d918723f8ec5b5e23f395de2b4b6ddd05c4482a" gracePeriod=30 Nov 26 15:08:57 crc kubenswrapper[4651]: I1126 15:08:57.098590 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" containerName="glance-log" containerID="cri-o://b18db86a93db98c43ae81c9e05be8dec3a64dcb4461b4f2f64d198b8779af2c0" gracePeriod=30 Nov 26 15:08:57 crc kubenswrapper[4651]: I1126 15:08:57.891574 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:57 crc kubenswrapper[4651]: I1126 15:08:57.891867 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="ceilometer-central-agent" containerID="cri-o://b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1" gracePeriod=30 Nov 26 15:08:57 crc kubenswrapper[4651]: I1126 15:08:57.892240 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="proxy-httpd" containerID="cri-o://d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0" gracePeriod=30 Nov 26 15:08:57 crc kubenswrapper[4651]: I1126 15:08:57.892452 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="ceilometer-notification-agent" containerID="cri-o://dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa" gracePeriod=30 Nov 26 15:08:57 crc kubenswrapper[4651]: I1126 15:08:57.892505 4651 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ceilometer-0" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="sg-core" containerID="cri-o://edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10" gracePeriod=30 Nov 26 15:08:57 crc kubenswrapper[4651]: I1126 15:08:57.906604 4651 generic.go:334] "Generic (PLEG): container finished" podID="cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" containerID="b18db86a93db98c43ae81c9e05be8dec3a64dcb4461b4f2f64d198b8779af2c0" exitCode=143 Nov 26 15:08:57 crc kubenswrapper[4651]: I1126 15:08:57.906653 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb","Type":"ContainerDied","Data":"b18db86a93db98c43ae81c9e05be8dec3a64dcb4461b4f2f64d198b8779af2c0"} Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.819060 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.846118 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-run-httpd\") pod \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.846164 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-sg-core-conf-yaml\") pod \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.846229 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-scripts\") pod \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\" (UID: 
\"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.846265 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-config-data\") pod \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.846323 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-combined-ca-bundle\") pod \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.846344 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rlsj\" (UniqueName: \"kubernetes.io/projected/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-kube-api-access-6rlsj\") pod \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.846381 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-log-httpd\") pod \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\" (UID: \"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25\") " Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.847136 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" (UID: "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.847325 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" (UID: "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.854026 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-scripts" (OuterVolumeSpecName: "scripts") pod "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" (UID: "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.862337 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-kube-api-access-6rlsj" (OuterVolumeSpecName: "kube-api-access-6rlsj") pod "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" (UID: "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25"). InnerVolumeSpecName "kube-api-access-6rlsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.947845 4651 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.948173 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.948183 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rlsj\" (UniqueName: \"kubernetes.io/projected/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-kube-api-access-6rlsj\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.948195 4651 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.949208 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" (UID: "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.966972 4651 generic.go:334] "Generic (PLEG): container finished" podID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerID="d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0" exitCode=0 Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.967003 4651 generic.go:334] "Generic (PLEG): container finished" podID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerID="edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10" exitCode=2 Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.967012 4651 generic.go:334] "Generic (PLEG): container finished" podID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerID="dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa" exitCode=0 Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.967020 4651 generic.go:334] "Generic (PLEG): container finished" podID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerID="b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1" exitCode=0 Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.967074 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25","Type":"ContainerDied","Data":"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0"} Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.967101 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25","Type":"ContainerDied","Data":"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10"} Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.967111 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25","Type":"ContainerDied","Data":"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa"} Nov 26 15:08:58 crc 
kubenswrapper[4651]: I1126 15:08:58.967120 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25","Type":"ContainerDied","Data":"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1"} Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.967129 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9264d3e-322e-4b9b-9f37-fac3f5e0bf25","Type":"ContainerDied","Data":"4084855609fb1c00446443830873c4dc36513a90fa0e102420104a4f356cc0e0"} Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.967144 4651 scope.go:117] "RemoveContainer" containerID="d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0" Nov 26 15:08:58 crc kubenswrapper[4651]: I1126 15:08:58.967276 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.025104 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" (UID: "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.039320 4651 scope.go:117] "RemoveContainer" containerID="edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.048907 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.049399 4651 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.059083 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-config-data" (OuterVolumeSpecName: "config-data") pod "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" (UID: "c9264d3e-322e-4b9b-9f37-fac3f5e0bf25"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.074090 4651 scope.go:117] "RemoveContainer" containerID="dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.108063 4651 scope.go:117] "RemoveContainer" containerID="b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.133063 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.133135 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.151003 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.151137 4651 scope.go:117] "RemoveContainer" containerID="d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0" Nov 26 15:08:59 crc kubenswrapper[4651]: E1126 15:08:59.151729 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0\": container with ID starting with d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0 not found: ID does not exist" 
containerID="d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.151775 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0"} err="failed to get container status \"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0\": rpc error: code = NotFound desc = could not find container \"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0\": container with ID starting with d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0 not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.151803 4651 scope.go:117] "RemoveContainer" containerID="edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10" Nov 26 15:08:59 crc kubenswrapper[4651]: E1126 15:08:59.152535 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10\": container with ID starting with edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10 not found: ID does not exist" containerID="edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.152574 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10"} err="failed to get container status \"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10\": rpc error: code = NotFound desc = could not find container \"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10\": container with ID starting with edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10 not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.152593 4651 scope.go:117] 
"RemoveContainer" containerID="dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa" Nov 26 15:08:59 crc kubenswrapper[4651]: E1126 15:08:59.153352 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa\": container with ID starting with dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa not found: ID does not exist" containerID="dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.153382 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa"} err="failed to get container status \"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa\": rpc error: code = NotFound desc = could not find container \"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa\": container with ID starting with dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.153399 4651 scope.go:117] "RemoveContainer" containerID="b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1" Nov 26 15:08:59 crc kubenswrapper[4651]: E1126 15:08:59.153830 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1\": container with ID starting with b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1 not found: ID does not exist" containerID="b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.153878 4651 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1"} err="failed to get container status \"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1\": rpc error: code = NotFound desc = could not find container \"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1\": container with ID starting with b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1 not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.153910 4651 scope.go:117] "RemoveContainer" containerID="d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.154547 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0"} err="failed to get container status \"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0\": rpc error: code = NotFound desc = could not find container \"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0\": container with ID starting with d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0 not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.154569 4651 scope.go:117] "RemoveContainer" containerID="edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.154856 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10"} err="failed to get container status \"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10\": rpc error: code = NotFound desc = could not find container \"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10\": container with ID starting with edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10 not found: ID does not 
exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.154877 4651 scope.go:117] "RemoveContainer" containerID="dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.155200 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa"} err="failed to get container status \"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa\": rpc error: code = NotFound desc = could not find container \"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa\": container with ID starting with dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.155220 4651 scope.go:117] "RemoveContainer" containerID="b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.155550 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1"} err="failed to get container status \"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1\": rpc error: code = NotFound desc = could not find container \"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1\": container with ID starting with b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1 not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.155572 4651 scope.go:117] "RemoveContainer" containerID="d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.156902 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0"} err="failed to get container status 
\"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0\": rpc error: code = NotFound desc = could not find container \"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0\": container with ID starting with d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0 not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.156974 4651 scope.go:117] "RemoveContainer" containerID="edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.157530 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10"} err="failed to get container status \"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10\": rpc error: code = NotFound desc = could not find container \"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10\": container with ID starting with edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10 not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.157555 4651 scope.go:117] "RemoveContainer" containerID="dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.157781 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa"} err="failed to get container status \"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa\": rpc error: code = NotFound desc = could not find container \"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa\": container with ID starting with dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.157800 4651 scope.go:117] "RemoveContainer" 
containerID="b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.157973 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1"} err="failed to get container status \"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1\": rpc error: code = NotFound desc = could not find container \"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1\": container with ID starting with b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1 not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.157990 4651 scope.go:117] "RemoveContainer" containerID="d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.158853 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0"} err="failed to get container status \"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0\": rpc error: code = NotFound desc = could not find container \"d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0\": container with ID starting with d6e866dba4ef00d537abe0db659ba5744ac9cf08928acf44d065dd1b1ff363d0 not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.158871 4651 scope.go:117] "RemoveContainer" containerID="edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.159367 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10"} err="failed to get container status \"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10\": rpc error: code = NotFound desc = could 
not find container \"edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10\": container with ID starting with edc9b21f3d1869c6b72c7927865347300f46a36be00133b00c19857168d8ed10 not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.159383 4651 scope.go:117] "RemoveContainer" containerID="dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.159690 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa"} err="failed to get container status \"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa\": rpc error: code = NotFound desc = could not find container \"dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa\": container with ID starting with dd1df9e172ae55599c08e3dbb674d2fc0af9325d01d11727eb0bd0cf41b7b5aa not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.159713 4651 scope.go:117] "RemoveContainer" containerID="b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.160322 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1"} err="failed to get container status \"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1\": rpc error: code = NotFound desc = could not find container \"b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1\": container with ID starting with b0219f3a4eab94c5889f0ddafeb395796fb18ab3aa392408a72ea82d305199d1 not found: ID does not exist" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.361206 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.386099 4651 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.390073 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:59 crc kubenswrapper[4651]: E1126 15:08:59.390483 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="ceilometer-notification-agent" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.390498 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="ceilometer-notification-agent" Nov 26 15:08:59 crc kubenswrapper[4651]: E1126 15:08:59.390514 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="sg-core" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.390522 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="sg-core" Nov 26 15:08:59 crc kubenswrapper[4651]: E1126 15:08:59.390533 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="proxy-httpd" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.390540 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="proxy-httpd" Nov 26 15:08:59 crc kubenswrapper[4651]: E1126 15:08:59.390565 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="ceilometer-central-agent" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.390572 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="ceilometer-central-agent" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.390799 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" 
containerName="proxy-httpd" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.390823 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="ceilometer-notification-agent" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.390840 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="sg-core" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.390852 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" containerName="ceilometer-central-agent" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.392918 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.395441 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.396374 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.457883 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9264d3e-322e-4b9b-9f37-fac3f5e0bf25" path="/var/lib/kubelet/pods/c9264d3e-322e-4b9b-9f37-fac3f5e0bf25/volumes" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.459029 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.558985 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.565689 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-log-httpd\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.565791 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-scripts\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.565942 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-config-data\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.566131 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.566179 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-run-httpd\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.566211 4651 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9mbg\" (UniqueName: \"kubernetes.io/projected/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-kube-api-access-k9mbg\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.566293 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.622003 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667153 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-httpd-run\") pod \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667249 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-config-data\") pod \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667286 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-scripts\") pod \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667317 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-public-tls-certs\") pod \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667336 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-logs\") pod \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667390 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667439 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csjxj\" (UniqueName: \"kubernetes.io/projected/7194a748-fcec-46b2-b6b7-a3af88cd8e14-kube-api-access-csjxj\") pod \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667537 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-combined-ca-bundle\") pod \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\" (UID: \"7194a748-fcec-46b2-b6b7-a3af88cd8e14\") " Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667765 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-scripts\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667835 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-config-data\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667888 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667926 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-run-httpd\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.667954 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9mbg\" (UniqueName: \"kubernetes.io/projected/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-kube-api-access-k9mbg\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.668001 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.668083 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-log-httpd\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.669442 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7194a748-fcec-46b2-b6b7-a3af88cd8e14" (UID: "7194a748-fcec-46b2-b6b7-a3af88cd8e14"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.670489 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-run-httpd\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.679561 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-scripts" (OuterVolumeSpecName: "scripts") pod "7194a748-fcec-46b2-b6b7-a3af88cd8e14" (UID: "7194a748-fcec-46b2-b6b7-a3af88cd8e14"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.679897 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-logs" (OuterVolumeSpecName: "logs") pod "7194a748-fcec-46b2-b6b7-a3af88cd8e14" (UID: "7194a748-fcec-46b2-b6b7-a3af88cd8e14"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.681562 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-log-httpd\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.681941 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.685136 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-scripts\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.688648 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "7194a748-fcec-46b2-b6b7-a3af88cd8e14" (UID: "7194a748-fcec-46b2-b6b7-a3af88cd8e14"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.688825 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7194a748-fcec-46b2-b6b7-a3af88cd8e14-kube-api-access-csjxj" (OuterVolumeSpecName: "kube-api-access-csjxj") pod "7194a748-fcec-46b2-b6b7-a3af88cd8e14" (UID: "7194a748-fcec-46b2-b6b7-a3af88cd8e14"). InnerVolumeSpecName "kube-api-access-csjxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.696166 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-config-data\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.703283 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.734988 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9mbg\" (UniqueName: \"kubernetes.io/projected/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-kube-api-access-k9mbg\") pod \"ceilometer-0\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.746183 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7194a748-fcec-46b2-b6b7-a3af88cd8e14" (UID: "7194a748-fcec-46b2-b6b7-a3af88cd8e14"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.770624 4651 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.770654 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.770663 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7194a748-fcec-46b2-b6b7-a3af88cd8e14-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.770689 4651 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.770699 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csjxj\" (UniqueName: \"kubernetes.io/projected/7194a748-fcec-46b2-b6b7-a3af88cd8e14-kube-api-access-csjxj\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.770709 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.783983 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f54c7c77d-rx8gm" podUID="5c09de21-84b0-440d-b34c-3054ec6741fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection 
refused" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.789908 4651 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.792656 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-config-data" (OuterVolumeSpecName: "config-data") pod "7194a748-fcec-46b2-b6b7-a3af88cd8e14" (UID: "7194a748-fcec-46b2-b6b7-a3af88cd8e14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.805234 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7194a748-fcec-46b2-b6b7-a3af88cd8e14" (UID: "7194a748-fcec-46b2-b6b7-a3af88cd8e14"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.856162 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.872543 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.873068 4651 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7194a748-fcec-46b2-b6b7-a3af88cd8e14-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:08:59 crc kubenswrapper[4651]: I1126 15:08:59.873140 4651 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:08:59.999630 4651 generic.go:334] "Generic (PLEG): container finished" podID="7194a748-fcec-46b2-b6b7-a3af88cd8e14" containerID="1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88" exitCode=0 Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.000067 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.000347 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7194a748-fcec-46b2-b6b7-a3af88cd8e14","Type":"ContainerDied","Data":"1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88"} Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.000375 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7194a748-fcec-46b2-b6b7-a3af88cd8e14","Type":"ContainerDied","Data":"390f2bf62f6eb34eef45fe1638494d67d84583759fb9ae8626a53a3607ff275f"} Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.000389 4651 scope.go:117] "RemoveContainer" containerID="1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.047986 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.080000 4651 scope.go:117] "RemoveContainer" containerID="cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.104962 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.144225 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:09:00 crc kubenswrapper[4651]: E1126 15:09:00.144652 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7194a748-fcec-46b2-b6b7-a3af88cd8e14" containerName="glance-log" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.144690 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="7194a748-fcec-46b2-b6b7-a3af88cd8e14" containerName="glance-log" Nov 26 15:09:00 crc kubenswrapper[4651]: E1126 15:09:00.144736 
4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7194a748-fcec-46b2-b6b7-a3af88cd8e14" containerName="glance-httpd" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.144746 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="7194a748-fcec-46b2-b6b7-a3af88cd8e14" containerName="glance-httpd" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.144992 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="7194a748-fcec-46b2-b6b7-a3af88cd8e14" containerName="glance-httpd" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.145028 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="7194a748-fcec-46b2-b6b7-a3af88cd8e14" containerName="glance-log" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.145928 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.148767 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.149343 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.166935 4651 scope.go:117] "RemoveContainer" containerID="1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.167025 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:09:00 crc kubenswrapper[4651]: E1126 15:09:00.183761 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88\": container with ID starting with 1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88 not found: ID does not exist" 
containerID="1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.183895 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88"} err="failed to get container status \"1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88\": rpc error: code = NotFound desc = could not find container \"1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88\": container with ID starting with 1c54b61b9159f23cf47729a57caf23b1e176892fa550c212f0c050da72adcf88 not found: ID does not exist" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.183971 4651 scope.go:117] "RemoveContainer" containerID="cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b" Nov 26 15:09:00 crc kubenswrapper[4651]: E1126 15:09:00.185081 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b\": container with ID starting with cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b not found: ID does not exist" containerID="cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.185140 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b"} err="failed to get container status \"cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b\": rpc error: code = NotFound desc = could not find container \"cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b\": container with ID starting with cadcc5c040f2d209120db7086f83ad4bdb413fc721027c3419ec41f1b5de104b not found: ID does not exist" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.200100 4651 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ff8933-59b2-4620-a03d-a14767db747d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.200179 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-config-data\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.200239 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.203147 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.203202 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 
15:09:00.203304 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ff8933-59b2-4620-a03d-a14767db747d-logs\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.203351 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztm7f\" (UniqueName: \"kubernetes.io/projected/49ff8933-59b2-4620-a03d-a14767db747d-kube-api-access-ztm7f\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.203462 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-scripts\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.305418 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ff8933-59b2-4620-a03d-a14767db747d-logs\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.305743 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztm7f\" (UniqueName: \"kubernetes.io/projected/49ff8933-59b2-4620-a03d-a14767db747d-kube-api-access-ztm7f\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.305805 
4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-scripts\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.305856 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ff8933-59b2-4620-a03d-a14767db747d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.305884 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-config-data\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.305916 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.305953 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.305975 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.306328 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ff8933-59b2-4620-a03d-a14767db747d-logs\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.306546 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49ff8933-59b2-4620-a03d-a14767db747d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.308324 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.320658 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.320939 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.321851 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-config-data\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.322568 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ff8933-59b2-4620-a03d-a14767db747d-scripts\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.353680 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.363967 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztm7f\" (UniqueName: \"kubernetes.io/projected/49ff8933-59b2-4620-a03d-a14767db747d-kube-api-access-ztm7f\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.389120 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"49ff8933-59b2-4620-a03d-a14767db747d\") " pod="openstack/glance-default-external-api-0" Nov 26 15:09:00 crc kubenswrapper[4651]: I1126 15:09:00.485127 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.045492 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.048283 4651 generic.go:334] "Generic (PLEG): container finished" podID="cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" containerID="9ec0d32784ed5ced6da55fc86d918723f8ec5b5e23f395de2b4b6ddd05c4482a" exitCode=0 Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.048390 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb","Type":"ContainerDied","Data":"9ec0d32784ed5ced6da55fc86d918723f8ec5b5e23f395de2b4b6ddd05c4482a"} Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.048420 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb","Type":"ContainerDied","Data":"6cd4b5123608a3c9f55fa91ce3ce8580fa91d20c75361541f3add47ebad54264"} Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.048433 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cd4b5123608a3c9f55fa91ce3ce8580fa91d20c75361541f3add47ebad54264" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.049550 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de01eb81-c2a9-4cb9-88e6-ee8484accc7b","Type":"ContainerStarted","Data":"24eaace5342091f657b0f26a66c875c216ba8f91e882d1e12d2f1c7dc9079a95"} Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.050671 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-shlzt" event={"ID":"20922a1a-1763-45a9-911a-161e1fc4bd1e","Type":"ContainerStarted","Data":"685533ab88cef62222b040d35727bbfefc8d1e3e65af18434ed7296f91609871"} Nov 26 15:09:01 
crc kubenswrapper[4651]: I1126 15:09:01.088588 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-shlzt" podStartSLOduration=2.11516518 podStartE2EDuration="31.088570391s" podCreationTimestamp="2025-11-26 15:08:30 +0000 UTC" firstStartedPulling="2025-11-26 15:08:31.061788587 +0000 UTC m=+1078.487536191" lastFinishedPulling="2025-11-26 15:09:00.035193798 +0000 UTC m=+1107.460941402" observedRunningTime="2025-11-26 15:09:01.082617418 +0000 UTC m=+1108.508365042" watchObservedRunningTime="2025-11-26 15:09:01.088570391 +0000 UTC m=+1108.514317995" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.230150 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-httpd-run\") pod \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.230593 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-combined-ca-bundle\") pod \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.230666 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-logs\") pod \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.230717 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzdkv\" (UniqueName: \"kubernetes.io/projected/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-kube-api-access-dzdkv\") pod \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\" (UID: 
\"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.230731 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-scripts\") pod \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.230747 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-internal-tls-certs\") pod \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.230762 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-config-data\") pod \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.230877 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\" (UID: \"cd7e69e3-90c8-4f33-94cb-bf972e5a72bb\") " Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.231277 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-logs" (OuterVolumeSpecName: "logs") pod "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" (UID: "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.231379 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.231480 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" (UID: "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.238613 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-scripts" (OuterVolumeSpecName: "scripts") pod "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" (UID: "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.247232 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-kube-api-access-dzdkv" (OuterVolumeSpecName: "kube-api-access-dzdkv") pod "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" (UID: "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb"). InnerVolumeSpecName "kube-api-access-dzdkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.251233 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" (UID: "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.301721 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" (UID: "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.306699 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-config-data" (OuterVolumeSpecName: "config-data") pod "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" (UID: "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.324211 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" (UID: "cd7e69e3-90c8-4f33-94cb-bf972e5a72bb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.329919 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.333080 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzdkv\" (UniqueName: \"kubernetes.io/projected/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-kube-api-access-dzdkv\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.333113 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.333125 4651 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.333135 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.333164 4651 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.333181 4651 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.333194 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.369880 4651 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.416972 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7194a748-fcec-46b2-b6b7-a3af88cd8e14" path="/var/lib/kubelet/pods/7194a748-fcec-46b2-b6b7-a3af88cd8e14/volumes" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.434914 4651 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:01 crc kubenswrapper[4651]: I1126 15:09:01.482950 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.071677 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ff8933-59b2-4620-a03d-a14767db747d","Type":"ContainerStarted","Data":"737d5221b85be50ae57e5f9481f6660c3cfb3f8b76c2fbe7e7fe59bcada28a7b"} Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.075476 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.076494 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de01eb81-c2a9-4cb9-88e6-ee8484accc7b","Type":"ContainerStarted","Data":"d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1"} Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.375523 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.385384 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.453149 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:09:02 crc kubenswrapper[4651]: E1126 15:09:02.454115 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" containerName="glance-log" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.454132 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" containerName="glance-log" Nov 26 15:09:02 crc kubenswrapper[4651]: E1126 15:09:02.454148 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" containerName="glance-httpd" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.454155 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" containerName="glance-httpd" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.454498 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" containerName="glance-httpd" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.454538 4651 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" containerName="glance-log" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.457718 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.499323 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.500361 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.500514 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.559186 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.559255 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.559311 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2760c509-e00a-4aab-9471-0cf7f5177471-logs\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 
15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.559326 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2760c509-e00a-4aab-9471-0cf7f5177471-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.559371 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bv9t\" (UniqueName: \"kubernetes.io/projected/2760c509-e00a-4aab-9471-0cf7f5177471-kube-api-access-6bv9t\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.559400 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.559458 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.559535 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " 
pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.665263 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.665343 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.665392 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.665444 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2760c509-e00a-4aab-9471-0cf7f5177471-logs\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.665460 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2760c509-e00a-4aab-9471-0cf7f5177471-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 
15:09:02.665484 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bv9t\" (UniqueName: \"kubernetes.io/projected/2760c509-e00a-4aab-9471-0cf7f5177471-kube-api-access-6bv9t\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.665525 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.665560 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.667688 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2760c509-e00a-4aab-9471-0cf7f5177471-logs\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.668546 4651 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.668752 4651 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2760c509-e00a-4aab-9471-0cf7f5177471-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.679824 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.686670 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.690338 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.694189 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2760c509-e00a-4aab-9471-0cf7f5177471-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.703810 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bv9t\" (UniqueName: 
\"kubernetes.io/projected/2760c509-e00a-4aab-9471-0cf7f5177471-kube-api-access-6bv9t\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.748510 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"2760c509-e00a-4aab-9471-0cf7f5177471\") " pod="openstack/glance-default-internal-api-0" Nov 26 15:09:02 crc kubenswrapper[4651]: I1126 15:09:02.905591 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:03 crc kubenswrapper[4651]: I1126 15:09:03.140710 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"49ff8933-59b2-4620-a03d-a14767db747d","Type":"ContainerStarted","Data":"44e9199ec972e87ff810543b05eca993d51b5958e3c9fe5b316d598cf0aec169"} Nov 26 15:09:03 crc kubenswrapper[4651]: I1126 15:09:03.166690 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de01eb81-c2a9-4cb9-88e6-ee8484accc7b","Type":"ContainerStarted","Data":"420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d"} Nov 26 15:09:03 crc kubenswrapper[4651]: I1126 15:09:03.417995 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd7e69e3-90c8-4f33-94cb-bf972e5a72bb" path="/var/lib/kubelet/pods/cd7e69e3-90c8-4f33-94cb-bf972e5a72bb/volumes" Nov 26 15:09:03 crc kubenswrapper[4651]: I1126 15:09:03.607498 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 26 15:09:04 crc kubenswrapper[4651]: I1126 15:09:04.190634 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"49ff8933-59b2-4620-a03d-a14767db747d","Type":"ContainerStarted","Data":"a1f7ca352b465b3f3f1252f751506452025ed889f95a4cf06eefe1add1fd90fc"} Nov 26 15:09:04 crc kubenswrapper[4651]: I1126 15:09:04.196232 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2760c509-e00a-4aab-9471-0cf7f5177471","Type":"ContainerStarted","Data":"8fbdc81cc368aa4f5f7b0c169f6a4969d2bc7d9376a4ff32fd00175c3905f866"} Nov 26 15:09:04 crc kubenswrapper[4651]: I1126 15:09:04.231580 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.231562572 podStartE2EDuration="4.231562572s" podCreationTimestamp="2025-11-26 15:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:04.210722853 +0000 UTC m=+1111.636470477" watchObservedRunningTime="2025-11-26 15:09:04.231562572 +0000 UTC m=+1111.657310176" Nov 26 15:09:05 crc kubenswrapper[4651]: I1126 15:09:05.207233 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de01eb81-c2a9-4cb9-88e6-ee8484accc7b","Type":"ContainerStarted","Data":"076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c"} Nov 26 15:09:05 crc kubenswrapper[4651]: I1126 15:09:05.211392 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2760c509-e00a-4aab-9471-0cf7f5177471","Type":"ContainerStarted","Data":"26dec7a15dbce0fffb9e198caf1b30114495136ec70a9b31cdf9900af7c51bbe"} Nov 26 15:09:05 crc kubenswrapper[4651]: I1126 15:09:05.211416 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2760c509-e00a-4aab-9471-0cf7f5177471","Type":"ContainerStarted","Data":"6687a819b3ed774983af2706baaf73aaea89d02d8d82ee5e980641b334c41f39"} Nov 26 15:09:05 crc 
kubenswrapper[4651]: I1126 15:09:05.238804 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.2387825550000002 podStartE2EDuration="3.238782555s" podCreationTimestamp="2025-11-26 15:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:05.232572836 +0000 UTC m=+1112.658320450" watchObservedRunningTime="2025-11-26 15:09:05.238782555 +0000 UTC m=+1112.664530159" Nov 26 15:09:06 crc kubenswrapper[4651]: I1126 15:09:06.228201 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="ceilometer-central-agent" containerID="cri-o://d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1" gracePeriod=30 Nov 26 15:09:06 crc kubenswrapper[4651]: I1126 15:09:06.228638 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de01eb81-c2a9-4cb9-88e6-ee8484accc7b","Type":"ContainerStarted","Data":"fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88"} Nov 26 15:09:06 crc kubenswrapper[4651]: I1126 15:09:06.228803 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="proxy-httpd" containerID="cri-o://fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88" gracePeriod=30 Nov 26 15:09:06 crc kubenswrapper[4651]: I1126 15:09:06.228900 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="ceilometer-notification-agent" containerID="cri-o://420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d" gracePeriod=30 Nov 26 15:09:06 crc kubenswrapper[4651]: I1126 15:09:06.228902 4651 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="sg-core" containerID="cri-o://076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c" gracePeriod=30 Nov 26 15:09:06 crc kubenswrapper[4651]: I1126 15:09:06.228945 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 15:09:06 crc kubenswrapper[4651]: I1126 15:09:06.259410 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.820018946 podStartE2EDuration="7.259394913s" podCreationTimestamp="2025-11-26 15:08:59 +0000 UTC" firstStartedPulling="2025-11-26 15:09:00.38120154 +0000 UTC m=+1107.806949144" lastFinishedPulling="2025-11-26 15:09:05.820577517 +0000 UTC m=+1113.246325111" observedRunningTime="2025-11-26 15:09:06.258507909 +0000 UTC m=+1113.684255523" watchObservedRunningTime="2025-11-26 15:09:06.259394913 +0000 UTC m=+1113.685142517" Nov 26 15:09:07 crc kubenswrapper[4651]: I1126 15:09:07.241506 4651 generic.go:334] "Generic (PLEG): container finished" podID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerID="076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c" exitCode=2 Nov 26 15:09:07 crc kubenswrapper[4651]: I1126 15:09:07.241538 4651 generic.go:334] "Generic (PLEG): container finished" podID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerID="420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d" exitCode=0 Nov 26 15:09:07 crc kubenswrapper[4651]: I1126 15:09:07.241559 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de01eb81-c2a9-4cb9-88e6-ee8484accc7b","Type":"ContainerDied","Data":"076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c"} Nov 26 15:09:07 crc kubenswrapper[4651]: I1126 15:09:07.241583 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"de01eb81-c2a9-4cb9-88e6-ee8484accc7b","Type":"ContainerDied","Data":"420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d"} Nov 26 15:09:09 crc kubenswrapper[4651]: I1126 15:09:09.612713 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 26 15:09:09 crc kubenswrapper[4651]: I1126 15:09:09.613107 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:09:09 crc kubenswrapper[4651]: I1126 15:09:09.613936 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"a9e18539050248184621d537c94cd7c6c67bed8a523b93401626fecf6ae227ef"} pod="openstack/horizon-6974b49b94-vzn8h" containerMessage="Container horizon failed startup probe, will be restarted" Nov 26 15:09:09 crc kubenswrapper[4651]: I1126 15:09:09.613984 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" containerID="cri-o://a9e18539050248184621d537c94cd7c6c67bed8a523b93401626fecf6ae227ef" gracePeriod=30 Nov 26 15:09:09 crc kubenswrapper[4651]: I1126 15:09:09.781982 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f54c7c77d-rx8gm" podUID="5c09de21-84b0-440d-b34c-3054ec6741fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Nov 26 15:09:09 crc kubenswrapper[4651]: I1126 15:09:09.782570 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:09:09 crc kubenswrapper[4651]: I1126 15:09:09.783555 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"b24d36253d1184088df8f38e2aa41ad3371af1bbbe82d56ef4835ace475fee82"} pod="openstack/horizon-f54c7c77d-rx8gm" containerMessage="Container horizon failed startup probe, will be restarted" Nov 26 15:09:09 crc kubenswrapper[4651]: I1126 15:09:09.783693 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f54c7c77d-rx8gm" podUID="5c09de21-84b0-440d-b34c-3054ec6741fc" containerName="horizon" containerID="cri-o://b24d36253d1184088df8f38e2aa41ad3371af1bbbe82d56ef4835ace475fee82" gracePeriod=30 Nov 26 15:09:10 crc kubenswrapper[4651]: I1126 15:09:10.485944 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 15:09:10 crc kubenswrapper[4651]: I1126 15:09:10.485998 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 26 15:09:10 crc kubenswrapper[4651]: I1126 15:09:10.529496 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 15:09:10 crc kubenswrapper[4651]: I1126 15:09:10.534824 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 26 15:09:11 crc kubenswrapper[4651]: I1126 15:09:11.281314 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 26 15:09:11 crc kubenswrapper[4651]: I1126 15:09:11.281372 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 26 15:09:12 crc kubenswrapper[4651]: I1126 15:09:12.292816 4651 generic.go:334] "Generic (PLEG): container finished" 
podID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerID="d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1" exitCode=0 Nov 26 15:09:12 crc kubenswrapper[4651]: I1126 15:09:12.292886 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de01eb81-c2a9-4cb9-88e6-ee8484accc7b","Type":"ContainerDied","Data":"d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1"} Nov 26 15:09:12 crc kubenswrapper[4651]: I1126 15:09:12.906477 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:12 crc kubenswrapper[4651]: I1126 15:09:12.906779 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:12 crc kubenswrapper[4651]: I1126 15:09:12.942963 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:12 crc kubenswrapper[4651]: I1126 15:09:12.954492 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:13 crc kubenswrapper[4651]: I1126 15:09:13.301295 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:13 crc kubenswrapper[4651]: I1126 15:09:13.301331 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:13 crc kubenswrapper[4651]: I1126 15:09:13.833086 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 26 15:09:13 crc kubenswrapper[4651]: I1126 15:09:13.833224 4651 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 15:09:13 crc kubenswrapper[4651]: I1126 15:09:13.840654 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Nov 26 15:09:15 crc kubenswrapper[4651]: I1126 15:09:15.317678 4651 generic.go:334] "Generic (PLEG): container finished" podID="20922a1a-1763-45a9-911a-161e1fc4bd1e" containerID="685533ab88cef62222b040d35727bbfefc8d1e3e65af18434ed7296f91609871" exitCode=0 Nov 26 15:09:15 crc kubenswrapper[4651]: I1126 15:09:15.317760 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-shlzt" event={"ID":"20922a1a-1763-45a9-911a-161e1fc4bd1e","Type":"ContainerDied","Data":"685533ab88cef62222b040d35727bbfefc8d1e3e65af18434ed7296f91609871"} Nov 26 15:09:15 crc kubenswrapper[4651]: I1126 15:09:15.554085 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:15 crc kubenswrapper[4651]: I1126 15:09:15.554183 4651 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 15:09:15 crc kubenswrapper[4651]: I1126 15:09:15.555220 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.702921 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.822807 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-combined-ca-bundle\") pod \"20922a1a-1763-45a9-911a-161e1fc4bd1e\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.822901 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-config-data\") pod \"20922a1a-1763-45a9-911a-161e1fc4bd1e\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.822923 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbx8z\" (UniqueName: \"kubernetes.io/projected/20922a1a-1763-45a9-911a-161e1fc4bd1e-kube-api-access-nbx8z\") pod \"20922a1a-1763-45a9-911a-161e1fc4bd1e\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.822943 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-scripts\") pod \"20922a1a-1763-45a9-911a-161e1fc4bd1e\" (UID: \"20922a1a-1763-45a9-911a-161e1fc4bd1e\") " Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.829639 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-scripts" (OuterVolumeSpecName: "scripts") pod "20922a1a-1763-45a9-911a-161e1fc4bd1e" (UID: "20922a1a-1763-45a9-911a-161e1fc4bd1e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.830115 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20922a1a-1763-45a9-911a-161e1fc4bd1e-kube-api-access-nbx8z" (OuterVolumeSpecName: "kube-api-access-nbx8z") pod "20922a1a-1763-45a9-911a-161e1fc4bd1e" (UID: "20922a1a-1763-45a9-911a-161e1fc4bd1e"). InnerVolumeSpecName "kube-api-access-nbx8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.864756 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-config-data" (OuterVolumeSpecName: "config-data") pod "20922a1a-1763-45a9-911a-161e1fc4bd1e" (UID: "20922a1a-1763-45a9-911a-161e1fc4bd1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.882382 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20922a1a-1763-45a9-911a-161e1fc4bd1e" (UID: "20922a1a-1763-45a9-911a-161e1fc4bd1e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.925310 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbx8z\" (UniqueName: \"kubernetes.io/projected/20922a1a-1763-45a9-911a-161e1fc4bd1e-kube-api-access-nbx8z\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.925351 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.925362 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:16 crc kubenswrapper[4651]: I1126 15:09:16.925372 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20922a1a-1763-45a9-911a-161e1fc4bd1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.337690 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-shlzt" event={"ID":"20922a1a-1763-45a9-911a-161e1fc4bd1e","Type":"ContainerDied","Data":"f63c3fc37e41e8139ebb827afe061d9b57077c047447afb4d6ccb84e9f24baad"} Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.337733 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f63c3fc37e41e8139ebb827afe061d9b57077c047447afb4d6ccb84e9f24baad" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.337792 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-shlzt" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.446110 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:09:17 crc kubenswrapper[4651]: E1126 15:09:17.446505 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20922a1a-1763-45a9-911a-161e1fc4bd1e" containerName="nova-cell0-conductor-db-sync" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.446522 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="20922a1a-1763-45a9-911a-161e1fc4bd1e" containerName="nova-cell0-conductor-db-sync" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.446673 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="20922a1a-1763-45a9-911a-161e1fc4bd1e" containerName="nova-cell0-conductor-db-sync" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.447342 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.450142 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.453760 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wv4vg" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.460684 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.536154 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:09:17 crc kubenswrapper[4651]: E1126 15:09:17.536401 4651 
projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:09:17 crc kubenswrapper[4651]: E1126 15:09:17.536439 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6978d54687-jsqtl: configmap "swift-ring-files" not found Nov 26 15:09:17 crc kubenswrapper[4651]: E1126 15:09:17.536506 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift podName:09fca043-ad27-4285-8894-522bc6cc68f4 nodeName:}" failed. No retries permitted until 2025-11-26 15:10:21.536485218 +0000 UTC m=+1188.962232822 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift") pod "swift-proxy-6978d54687-jsqtl" (UID: "09fca043-ad27-4285-8894-522bc6cc68f4") : configmap "swift-ring-files" not found Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.637776 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f4t9\" (UniqueName: \"kubernetes.io/projected/a4e5f466-486a-4c2c-8cd7-528169932031-kube-api-access-8f4t9\") pod \"nova-cell0-conductor-0\" (UID: \"a4e5f466-486a-4c2c-8cd7-528169932031\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.637849 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e5f466-486a-4c2c-8cd7-528169932031-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a4e5f466-486a-4c2c-8cd7-528169932031\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.637996 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4e5f466-486a-4c2c-8cd7-528169932031-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a4e5f466-486a-4c2c-8cd7-528169932031\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.740389 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f4t9\" (UniqueName: \"kubernetes.io/projected/a4e5f466-486a-4c2c-8cd7-528169932031-kube-api-access-8f4t9\") pod \"nova-cell0-conductor-0\" (UID: \"a4e5f466-486a-4c2c-8cd7-528169932031\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.740443 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e5f466-486a-4c2c-8cd7-528169932031-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a4e5f466-486a-4c2c-8cd7-528169932031\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.740499 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e5f466-486a-4c2c-8cd7-528169932031-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a4e5f466-486a-4c2c-8cd7-528169932031\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.751195 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4e5f466-486a-4c2c-8cd7-528169932031-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a4e5f466-486a-4c2c-8cd7-528169932031\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.757548 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e5f466-486a-4c2c-8cd7-528169932031-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"a4e5f466-486a-4c2c-8cd7-528169932031\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:17 crc kubenswrapper[4651]: I1126 15:09:17.773292 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f4t9\" (UniqueName: \"kubernetes.io/projected/a4e5f466-486a-4c2c-8cd7-528169932031-kube-api-access-8f4t9\") pod \"nova-cell0-conductor-0\" (UID: \"a4e5f466-486a-4c2c-8cd7-528169932031\") " pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:18 crc kubenswrapper[4651]: I1126 15:09:18.064218 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:18 crc kubenswrapper[4651]: I1126 15:09:18.497009 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 26 15:09:19 crc kubenswrapper[4651]: I1126 15:09:19.357901 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a4e5f466-486a-4c2c-8cd7-528169932031","Type":"ContainerStarted","Data":"ea8eeb3cbdb1e06938ec8ab02ca6fb071c3021b85528dfcdbb624ba6fb63c3f3"} Nov 26 15:09:19 crc kubenswrapper[4651]: I1126 15:09:19.358297 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a4e5f466-486a-4c2c-8cd7-528169932031","Type":"ContainerStarted","Data":"085a0f31a7d7b1372163c3a8881574bdca67d3bf8a97aa9c64a6106f7488633a"} Nov 26 15:09:19 crc kubenswrapper[4651]: I1126 15:09:19.358338 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:19 crc kubenswrapper[4651]: I1126 15:09:19.377237 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.377219058 podStartE2EDuration="2.377219058s" podCreationTimestamp="2025-11-26 15:09:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-26 15:09:19.375929922 +0000 UTC m=+1126.801677526" watchObservedRunningTime="2025-11-26 15:09:19.377219058 +0000 UTC m=+1126.802966662" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.092278 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.702645 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-tpjxc"] Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.704291 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.706536 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.711843 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.717727 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tpjxc"] Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.753614 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.753833 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjvtl\" (UniqueName: \"kubernetes.io/projected/ad85fcab-3573-4019-89bc-f35413ff0a9d-kube-api-access-cjvtl\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " 
pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.753968 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-scripts\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.754366 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-config-data\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.856206 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-config-data\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.857379 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.857463 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjvtl\" (UniqueName: \"kubernetes.io/projected/ad85fcab-3573-4019-89bc-f35413ff0a9d-kube-api-access-cjvtl\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " 
pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.857582 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-scripts\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.864819 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.868418 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-scripts\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.870023 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-config-data\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.932688 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjvtl\" (UniqueName: \"kubernetes.io/projected/ad85fcab-3573-4019-89bc-f35413ff0a9d-kube-api-access-cjvtl\") pod \"nova-cell0-cell-mapping-tpjxc\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 
15:09:23.978336 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.981001 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:09:23 crc kubenswrapper[4651]: I1126 15:09:23.988431 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.007653 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.062850 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.086275 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.087672 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.111281 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.161417 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.165645 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk8hd\" (UniqueName: \"kubernetes.io/projected/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-kube-api-access-dk8hd\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.165876 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-config-data\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.166067 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.166163 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-logs\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.199680 4651 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-scheduler-0"] Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.201120 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.218316 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.225165 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.269848 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.269895 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.269914 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d11b506d-3f44-4b5b-bc10-f3b956b719bf-logs\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.269932 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-config-data\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " 
pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.269956 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-logs\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.270008 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjhl5\" (UniqueName: \"kubernetes.io/projected/d11b506d-3f44-4b5b-bc10-f3b956b719bf-kube-api-access-mjhl5\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.270092 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk8hd\" (UniqueName: \"kubernetes.io/projected/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-kube-api-access-dk8hd\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.270116 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-config-data\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.270721 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-logs\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.293881 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.295510 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-config-data\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.333063 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56d99cc479-5blqc"] Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.334604 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.340419 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56d99cc479-5blqc"] Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.365112 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk8hd\" (UniqueName: \"kubernetes.io/projected/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-kube-api-access-dk8hd\") pod \"nova-api-0\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.371484 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjhl5\" (UniqueName: \"kubernetes.io/projected/d11b506d-3f44-4b5b-bc10-f3b956b719bf-kube-api-access-mjhl5\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.371562 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz8x2\" (UniqueName: 
\"kubernetes.io/projected/dfde06fb-eb75-4221-a100-e2315fec4e5c-kube-api-access-sz8x2\") pod \"nova-scheduler-0\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.371622 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.371743 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.371785 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d11b506d-3f44-4b5b-bc10-f3b956b719bf-logs\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.371811 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-config-data\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.371836 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-config-data\") pod \"nova-scheduler-0\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " 
pod="openstack/nova-scheduler-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.375163 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d11b506d-3f44-4b5b-bc10-f3b956b719bf-logs\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.375974 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.376971 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-config-data\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.377534 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.380520 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.382497 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.389368 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.405783 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjhl5\" (UniqueName: 
\"kubernetes.io/projected/d11b506d-3f44-4b5b-bc10-f3b956b719bf-kube-api-access-mjhl5\") pod \"nova-metadata-0\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.473350 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.473594 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.473617 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l57lb\" (UniqueName: \"kubernetes.io/projected/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-kube-api-access-l57lb\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.473654 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-nb\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.473690 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-config-data\") pod \"nova-scheduler-0\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.473730 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-config\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.473781 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz8x2\" (UniqueName: \"kubernetes.io/projected/dfde06fb-eb75-4221-a100-e2315fec4e5c-kube-api-access-sz8x2\") pod \"nova-scheduler-0\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.473799 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2lgk\" (UniqueName: \"kubernetes.io/projected/9f06ade1-9dc2-4175-a606-d83dc39d2c24-kube-api-access-s2lgk\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.473818 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-sb\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.473845 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.473880 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-dns-svc\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.481565 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.490599 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-config-data\") pod \"nova-scheduler-0\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.499174 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz8x2\" (UniqueName: \"kubernetes.io/projected/dfde06fb-eb75-4221-a100-e2315fec4e5c-kube-api-access-sz8x2\") pod \"nova-scheduler-0\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.574638 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.575723 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-sb\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.575794 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-dns-svc\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.575855 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.575874 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.575898 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l57lb\" (UniqueName: \"kubernetes.io/projected/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-kube-api-access-l57lb\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc 
kubenswrapper[4651]: I1126 15:09:24.575935 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-nb\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.576008 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-config\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.576229 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2lgk\" (UniqueName: \"kubernetes.io/projected/9f06ade1-9dc2-4175-a606-d83dc39d2c24-kube-api-access-s2lgk\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.576801 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-dns-svc\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.577406 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-sb\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.580243 4651 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-nb\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.581630 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-config\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.582101 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.597645 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.598753 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l57lb\" (UniqueName: \"kubernetes.io/projected/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-kube-api-access-l57lb\") pod \"dnsmasq-dns-56d99cc479-5blqc\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.608849 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2lgk\" (UniqueName: \"kubernetes.io/projected/9f06ade1-9dc2-4175-a606-d83dc39d2c24-kube-api-access-s2lgk\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.609096 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.656156 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.669551 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.722096 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:24 crc kubenswrapper[4651]: I1126 15:09:24.937101 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tpjxc"] Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.038978 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.312502 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qmd5q"] Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.315406 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.319466 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.319888 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.326222 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qmd5q"] Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.430793 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d11b506d-3f44-4b5b-bc10-f3b956b719bf","Type":"ContainerStarted","Data":"3f23a0f6894062e1d8514637db40d67cf7dd866781cc0f6af004434b9ade85cb"} Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.432599 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tpjxc" event={"ID":"ad85fcab-3573-4019-89bc-f35413ff0a9d","Type":"ContainerStarted","Data":"9eec96ebad56c4bde87309892af8528ac22137803befe3b50c792a3509d4efc1"} Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.432634 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-tpjxc" event={"ID":"ad85fcab-3573-4019-89bc-f35413ff0a9d","Type":"ContainerStarted","Data":"b974712bc08c246a869f9ba6b0cbdbc26d1cc34162b83cf4bc81cbe42249cd92"} Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.464776 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-tpjxc" podStartSLOduration=2.46475955 podStartE2EDuration="2.46475955s" podCreationTimestamp="2025-11-26 15:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:25.461404808 +0000 UTC m=+1132.887152412" watchObservedRunningTime="2025-11-26 15:09:25.46475955 +0000 UTC m=+1132.890507154" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.499825 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-config-data\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.500190 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.500288 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-scripts\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " 
pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.500701 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl4lt\" (UniqueName: \"kubernetes.io/projected/86a2f131-c449-4541-9822-75711dee8ad3-kube-api-access-zl4lt\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.546623 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.555166 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56d99cc479-5blqc"] Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.603406 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-config-data\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.603507 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.603548 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-scripts\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 
crc kubenswrapper[4651]: I1126 15:09:25.603610 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl4lt\" (UniqueName: \"kubernetes.io/projected/86a2f131-c449-4541-9822-75711dee8ad3-kube-api-access-zl4lt\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.609221 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-config-data\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.609831 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.610659 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-scripts\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.626718 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl4lt\" (UniqueName: \"kubernetes.io/projected/86a2f131-c449-4541-9822-75711dee8ad3-kube-api-access-zl4lt\") pod \"nova-cell1-conductor-db-sync-qmd5q\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc 
kubenswrapper[4651]: I1126 15:09:25.642527 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.658556 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:25 crc kubenswrapper[4651]: I1126 15:09:25.685931 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:09:26 crc kubenswrapper[4651]: I1126 15:09:26.169531 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qmd5q"] Nov 26 15:09:26 crc kubenswrapper[4651]: I1126 15:09:26.467691 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qmd5q" event={"ID":"86a2f131-c449-4541-9822-75711dee8ad3","Type":"ContainerStarted","Data":"6ca2e021e879a39af180b2ead0195b0933490ae1da77ea0e233e90766f4e01c4"} Nov 26 15:09:26 crc kubenswrapper[4651]: I1126 15:09:26.471771 4651 generic.go:334] "Generic (PLEG): container finished" podID="1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" containerID="e876c628b4ef2f370ad838ef81ff84f8987828adb52bf9265dd04beaeb25d5cd" exitCode=0 Nov 26 15:09:26 crc kubenswrapper[4651]: I1126 15:09:26.471855 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" event={"ID":"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4","Type":"ContainerDied","Data":"e876c628b4ef2f370ad838ef81ff84f8987828adb52bf9265dd04beaeb25d5cd"} Nov 26 15:09:26 crc kubenswrapper[4651]: I1126 15:09:26.471884 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" event={"ID":"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4","Type":"ContainerStarted","Data":"453e940b80b90eb0ca818acad9f17422383bc68c42d96fbdf44ae974543a52ca"} Nov 26 15:09:26 crc kubenswrapper[4651]: I1126 15:09:26.475150 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"dfde06fb-eb75-4221-a100-e2315fec4e5c","Type":"ContainerStarted","Data":"b8ab35f11ead8d780fa5ea965b0ad78f6ff051cc4411aa08b3b5c101d7f7274c"} Nov 26 15:09:26 crc kubenswrapper[4651]: I1126 15:09:26.477564 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9f06ade1-9dc2-4175-a606-d83dc39d2c24","Type":"ContainerStarted","Data":"608570de0132a08a0806d960b0dd1a1f8511901be6f6038535322589ea75f3de"} Nov 26 15:09:26 crc kubenswrapper[4651]: I1126 15:09:26.481476 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59","Type":"ContainerStarted","Data":"1e525a3b792b9c8fbc3c4dedca2d426b454584001127985ca7f54fc483c5a99e"} Nov 26 15:09:27 crc kubenswrapper[4651]: I1126 15:09:27.490523 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qmd5q" event={"ID":"86a2f131-c449-4541-9822-75711dee8ad3","Type":"ContainerStarted","Data":"e6ee1006ac27ab0156c31ea55455ea2e3575009349594dabbb480fe3092cd6ad"} Nov 26 15:09:27 crc kubenswrapper[4651]: I1126 15:09:27.496152 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" event={"ID":"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4","Type":"ContainerStarted","Data":"4a4acab3bcd2984c44e8ad06fdad13952da7964ad17a9ef4b79965e76753e9d0"} Nov 26 15:09:27 crc kubenswrapper[4651]: I1126 15:09:27.496283 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:27 crc kubenswrapper[4651]: I1126 15:09:27.531276 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qmd5q" podStartSLOduration=2.531229156 podStartE2EDuration="2.531229156s" podCreationTimestamp="2025-11-26 15:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-26 15:09:27.505663587 +0000 UTC m=+1134.931411191" watchObservedRunningTime="2025-11-26 15:09:27.531229156 +0000 UTC m=+1134.956976760" Nov 26 15:09:27 crc kubenswrapper[4651]: I1126 15:09:27.548762 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" podStartSLOduration=3.5487359339999998 podStartE2EDuration="3.548735934s" podCreationTimestamp="2025-11-26 15:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:27.523023632 +0000 UTC m=+1134.948771246" watchObservedRunningTime="2025-11-26 15:09:27.548735934 +0000 UTC m=+1134.974483538" Nov 26 15:09:27 crc kubenswrapper[4651]: I1126 15:09:27.727261 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:27 crc kubenswrapper[4651]: I1126 15:09:27.745271 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:09:29 crc kubenswrapper[4651]: I1126 15:09:29.136552 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:09:29 crc kubenswrapper[4651]: I1126 15:09:29.136909 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:09:29 crc kubenswrapper[4651]: I1126 15:09:29.136960 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 15:09:29 crc kubenswrapper[4651]: I1126 15:09:29.137795 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b324081402e9e9abd725d1ece3f18cded052636ec277c013a1f5a3dea9b3cf7"} pod="openshift-machine-config-operator/machine-config-daemon-99mrs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:09:29 crc kubenswrapper[4651]: I1126 15:09:29.137851 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" containerID="cri-o://1b324081402e9e9abd725d1ece3f18cded052636ec277c013a1f5a3dea9b3cf7" gracePeriod=600 Nov 26 15:09:29 crc kubenswrapper[4651]: I1126 15:09:29.556347 4651 generic.go:334] "Generic (PLEG): container finished" podID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerID="1b324081402e9e9abd725d1ece3f18cded052636ec277c013a1f5a3dea9b3cf7" exitCode=0 Nov 26 15:09:29 crc kubenswrapper[4651]: I1126 15:09:29.556727 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerDied","Data":"1b324081402e9e9abd725d1ece3f18cded052636ec277c013a1f5a3dea9b3cf7"} Nov 26 15:09:29 crc kubenswrapper[4651]: I1126 15:09:29.556788 4651 scope.go:117] "RemoveContainer" containerID="1bed2bd078ae425b6996e470a55f2b4cd2080217fee4c7bfa79d544ccd51cf36" Nov 26 15:09:29 crc kubenswrapper[4651]: I1126 15:09:29.915996 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 26 15:09:30 crc kubenswrapper[4651]: 
I1126 15:09:30.565467 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d11b506d-3f44-4b5b-bc10-f3b956b719bf","Type":"ContainerStarted","Data":"a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64"} Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.566541 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d11b506d-3f44-4b5b-bc10-f3b956b719bf","Type":"ContainerStarted","Data":"3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d"} Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.566661 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d11b506d-3f44-4b5b-bc10-f3b956b719bf" containerName="nova-metadata-log" containerID="cri-o://3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d" gracePeriod=30 Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.569852 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d11b506d-3f44-4b5b-bc10-f3b956b719bf" containerName="nova-metadata-metadata" containerID="cri-o://a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64" gracePeriod=30 Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.575827 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dfde06fb-eb75-4221-a100-e2315fec4e5c","Type":"ContainerStarted","Data":"da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a"} Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.580809 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"743e37a0879fef7149021c1c72d47f0f5826caa510cee0fbc25f23140cbdb919"} Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.583108 4651 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9f06ade1-9dc2-4175-a606-d83dc39d2c24","Type":"ContainerStarted","Data":"bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363"} Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.583229 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="9f06ade1-9dc2-4175-a606-d83dc39d2c24" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363" gracePeriod=30 Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.587600 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59","Type":"ContainerStarted","Data":"7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0"} Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.587646 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59","Type":"ContainerStarted","Data":"4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0"} Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.610886 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.372512925 podStartE2EDuration="7.610861196s" podCreationTimestamp="2025-11-26 15:09:23 +0000 UTC" firstStartedPulling="2025-11-26 15:09:25.127373354 +0000 UTC m=+1132.553120958" lastFinishedPulling="2025-11-26 15:09:29.365721625 +0000 UTC m=+1136.791469229" observedRunningTime="2025-11-26 15:09:30.601521171 +0000 UTC m=+1138.027268795" watchObservedRunningTime="2025-11-26 15:09:30.610861196 +0000 UTC m=+1138.036608800" Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.620424 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=3.920810522 podStartE2EDuration="7.620410337s" podCreationTimestamp="2025-11-26 15:09:23 +0000 UTC" firstStartedPulling="2025-11-26 15:09:25.7001578 +0000 UTC m=+1133.125905404" lastFinishedPulling="2025-11-26 15:09:29.399757615 +0000 UTC m=+1136.825505219" observedRunningTime="2025-11-26 15:09:30.617520698 +0000 UTC m=+1138.043268312" watchObservedRunningTime="2025-11-26 15:09:30.620410337 +0000 UTC m=+1138.046157941" Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.638850 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.726611757 podStartE2EDuration="6.63883203s" podCreationTimestamp="2025-11-26 15:09:24 +0000 UTC" firstStartedPulling="2025-11-26 15:09:25.490534894 +0000 UTC m=+1132.916282498" lastFinishedPulling="2025-11-26 15:09:29.402755167 +0000 UTC m=+1136.828502771" observedRunningTime="2025-11-26 15:09:30.634902522 +0000 UTC m=+1138.060650126" watchObservedRunningTime="2025-11-26 15:09:30.63883203 +0000 UTC m=+1138.064579634" Nov 26 15:09:30 crc kubenswrapper[4651]: I1126 15:09:30.687421 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.8826033779999998 podStartE2EDuration="6.687399947s" podCreationTimestamp="2025-11-26 15:09:24 +0000 UTC" firstStartedPulling="2025-11-26 15:09:25.642559976 +0000 UTC m=+1133.068307580" lastFinishedPulling="2025-11-26 15:09:29.447356545 +0000 UTC m=+1136.873104149" observedRunningTime="2025-11-26 15:09:30.685632849 +0000 UTC m=+1138.111380463" watchObservedRunningTime="2025-11-26 15:09:30.687399947 +0000 UTC m=+1138.113147551" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.162769 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.234514 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-combined-ca-bundle\") pod \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.234624 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-config-data\") pod \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.234691 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d11b506d-3f44-4b5b-bc10-f3b956b719bf-logs\") pod \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.234804 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjhl5\" (UniqueName: \"kubernetes.io/projected/d11b506d-3f44-4b5b-bc10-f3b956b719bf-kube-api-access-mjhl5\") pod \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\" (UID: \"d11b506d-3f44-4b5b-bc10-f3b956b719bf\") " Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.235496 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11b506d-3f44-4b5b-bc10-f3b956b719bf-logs" (OuterVolumeSpecName: "logs") pod "d11b506d-3f44-4b5b-bc10-f3b956b719bf" (UID: "d11b506d-3f44-4b5b-bc10-f3b956b719bf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.237360 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d11b506d-3f44-4b5b-bc10-f3b956b719bf-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.242090 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11b506d-3f44-4b5b-bc10-f3b956b719bf-kube-api-access-mjhl5" (OuterVolumeSpecName: "kube-api-access-mjhl5") pod "d11b506d-3f44-4b5b-bc10-f3b956b719bf" (UID: "d11b506d-3f44-4b5b-bc10-f3b956b719bf"). InnerVolumeSpecName "kube-api-access-mjhl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.339587 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjhl5\" (UniqueName: \"kubernetes.io/projected/d11b506d-3f44-4b5b-bc10-f3b956b719bf-kube-api-access-mjhl5\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.383533 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d11b506d-3f44-4b5b-bc10-f3b956b719bf" (UID: "d11b506d-3f44-4b5b-bc10-f3b956b719bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.386591 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-config-data" (OuterVolumeSpecName: "config-data") pod "d11b506d-3f44-4b5b-bc10-f3b956b719bf" (UID: "d11b506d-3f44-4b5b-bc10-f3b956b719bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.441695 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.442110 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11b506d-3f44-4b5b-bc10-f3b956b719bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.597971 4651 generic.go:334] "Generic (PLEG): container finished" podID="d11b506d-3f44-4b5b-bc10-f3b956b719bf" containerID="a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64" exitCode=0 Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.598955 4651 generic.go:334] "Generic (PLEG): container finished" podID="d11b506d-3f44-4b5b-bc10-f3b956b719bf" containerID="3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d" exitCode=143 Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.598067 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d11b506d-3f44-4b5b-bc10-f3b956b719bf","Type":"ContainerDied","Data":"a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64"} Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.599187 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d11b506d-3f44-4b5b-bc10-f3b956b719bf","Type":"ContainerDied","Data":"3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d"} Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.599223 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d11b506d-3f44-4b5b-bc10-f3b956b719bf","Type":"ContainerDied","Data":"3f23a0f6894062e1d8514637db40d67cf7dd866781cc0f6af004434b9ade85cb"} Nov 
26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.599240 4651 scope.go:117] "RemoveContainer" containerID="a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.598074 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.636232 4651 scope.go:117] "RemoveContainer" containerID="3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.636555 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.649230 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.681094 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:31 crc kubenswrapper[4651]: E1126 15:09:31.681511 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11b506d-3f44-4b5b-bc10-f3b956b719bf" containerName="nova-metadata-log" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.681528 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11b506d-3f44-4b5b-bc10-f3b956b719bf" containerName="nova-metadata-log" Nov 26 15:09:31 crc kubenswrapper[4651]: E1126 15:09:31.681548 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11b506d-3f44-4b5b-bc10-f3b956b719bf" containerName="nova-metadata-metadata" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.681554 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11b506d-3f44-4b5b-bc10-f3b956b719bf" containerName="nova-metadata-metadata" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.681733 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11b506d-3f44-4b5b-bc10-f3b956b719bf" 
containerName="nova-metadata-metadata" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.681750 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11b506d-3f44-4b5b-bc10-f3b956b719bf" containerName="nova-metadata-log" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.682734 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.683923 4651 scope.go:117] "RemoveContainer" containerID="a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.684486 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 26 15:09:31 crc kubenswrapper[4651]: E1126 15:09:31.684756 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64\": container with ID starting with a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64 not found: ID does not exist" containerID="a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.684870 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64"} err="failed to get container status \"a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64\": rpc error: code = NotFound desc = could not find container \"a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64\": container with ID starting with a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64 not found: ID does not exist" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.684969 4651 scope.go:117] "RemoveContainer" 
containerID="3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d" Nov 26 15:09:31 crc kubenswrapper[4651]: E1126 15:09:31.688486 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d\": container with ID starting with 3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d not found: ID does not exist" containerID="3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.688709 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d"} err="failed to get container status \"3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d\": rpc error: code = NotFound desc = could not find container \"3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d\": container with ID starting with 3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d not found: ID does not exist" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.688810 4651 scope.go:117] "RemoveContainer" containerID="a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.689832 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64"} err="failed to get container status \"a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64\": rpc error: code = NotFound desc = could not find container \"a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64\": container with ID starting with a4ecdea32119e43ea3d3ff25956949e081ceee05135061783e4a857285ebcf64 not found: ID does not exist" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.689876 4651 scope.go:117] 
"RemoveContainer" containerID="3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.691418 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d"} err="failed to get container status \"3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d\": rpc error: code = NotFound desc = could not find container \"3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d\": container with ID starting with 3e9b4d8b75d0af7de8ada24175e9a39516437dc707e5d8f5de655c388668a87d not found: ID does not exist" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.693467 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.693715 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.748277 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.748496 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c23199d-dbc9-4ba3-b993-648ef41a976f-logs\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.748588 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcwjx\" (UniqueName: 
\"kubernetes.io/projected/0c23199d-dbc9-4ba3-b993-648ef41a976f-kube-api-access-xcwjx\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.748727 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-config-data\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.748855 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.850377 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-config-data\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.850742 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.850922 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.851023 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c23199d-dbc9-4ba3-b993-648ef41a976f-logs\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.851156 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcwjx\" (UniqueName: \"kubernetes.io/projected/0c23199d-dbc9-4ba3-b993-648ef41a976f-kube-api-access-xcwjx\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.852808 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c23199d-dbc9-4ba3-b993-648ef41a976f-logs\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.857615 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.858939 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-config-data\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.865006 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:31 crc kubenswrapper[4651]: I1126 15:09:31.874002 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcwjx\" (UniqueName: \"kubernetes.io/projected/0c23199d-dbc9-4ba3-b993-648ef41a976f-kube-api-access-xcwjx\") pod \"nova-metadata-0\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " pod="openstack/nova-metadata-0" Nov 26 15:09:32 crc kubenswrapper[4651]: I1126 15:09:32.004077 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:09:32 crc kubenswrapper[4651]: I1126 15:09:32.529808 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:32 crc kubenswrapper[4651]: W1126 15:09:32.537202 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c23199d_dbc9_4ba3_b993_648ef41a976f.slice/crio-256fa58353df1625712e68135bc3ce66e798ad8cb038371f33c72f0ab7ce56e1 WatchSource:0}: Error finding container 256fa58353df1625712e68135bc3ce66e798ad8cb038371f33c72f0ab7ce56e1: Status 404 returned error can't find the container with id 256fa58353df1625712e68135bc3ce66e798ad8cb038371f33c72f0ab7ce56e1 Nov 26 15:09:32 crc kubenswrapper[4651]: I1126 15:09:32.617888 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c23199d-dbc9-4ba3-b993-648ef41a976f","Type":"ContainerStarted","Data":"256fa58353df1625712e68135bc3ce66e798ad8cb038371f33c72f0ab7ce56e1"} Nov 26 15:09:33 crc kubenswrapper[4651]: I1126 15:09:33.419498 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11b506d-3f44-4b5b-bc10-f3b956b719bf" 
path="/var/lib/kubelet/pods/d11b506d-3f44-4b5b-bc10-f3b956b719bf/volumes" Nov 26 15:09:33 crc kubenswrapper[4651]: I1126 15:09:33.628432 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c23199d-dbc9-4ba3-b993-648ef41a976f","Type":"ContainerStarted","Data":"344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8"} Nov 26 15:09:33 crc kubenswrapper[4651]: I1126 15:09:33.628472 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c23199d-dbc9-4ba3-b993-648ef41a976f","Type":"ContainerStarted","Data":"5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc"} Nov 26 15:09:34 crc kubenswrapper[4651]: I1126 15:09:34.598996 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 15:09:34 crc kubenswrapper[4651]: I1126 15:09:34.600229 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 26 15:09:34 crc kubenswrapper[4651]: I1126 15:09:34.629802 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 15:09:34 crc kubenswrapper[4651]: I1126 15:09:34.657609 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:09:34 crc kubenswrapper[4651]: I1126 15:09:34.657657 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:09:34 crc kubenswrapper[4651]: I1126 15:09:34.660278 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.660247075 podStartE2EDuration="3.660247075s" podCreationTimestamp="2025-11-26 15:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:33.655997094 +0000 UTC m=+1141.081744698" 
watchObservedRunningTime="2025-11-26 15:09:34.660247075 +0000 UTC m=+1142.085994679" Nov 26 15:09:34 crc kubenswrapper[4651]: I1126 15:09:34.670312 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 15:09:34 crc kubenswrapper[4651]: I1126 15:09:34.671182 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:09:34 crc kubenswrapper[4651]: I1126 15:09:34.722996 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:09:34 crc kubenswrapper[4651]: I1126 15:09:34.783056 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57fff66767-89s67"] Nov 26 15:09:34 crc kubenswrapper[4651]: I1126 15:09:34.783295 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57fff66767-89s67" podUID="615ee6d9-0216-4f0a-b9ea-579fc268806e" containerName="dnsmasq-dns" containerID="cri-o://07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34" gracePeriod=10 Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.482731 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57fff66767-89s67" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.559981 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntppr\" (UniqueName: \"kubernetes.io/projected/615ee6d9-0216-4f0a-b9ea-579fc268806e-kube-api-access-ntppr\") pod \"615ee6d9-0216-4f0a-b9ea-579fc268806e\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.560426 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-sb\") pod \"615ee6d9-0216-4f0a-b9ea-579fc268806e\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.560745 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-dns-svc\") pod \"615ee6d9-0216-4f0a-b9ea-579fc268806e\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.560879 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-nb\") pod \"615ee6d9-0216-4f0a-b9ea-579fc268806e\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.561462 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-config\") pod \"615ee6d9-0216-4f0a-b9ea-579fc268806e\" (UID: \"615ee6d9-0216-4f0a-b9ea-579fc268806e\") " Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.574155 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/615ee6d9-0216-4f0a-b9ea-579fc268806e-kube-api-access-ntppr" (OuterVolumeSpecName: "kube-api-access-ntppr") pod "615ee6d9-0216-4f0a-b9ea-579fc268806e" (UID: "615ee6d9-0216-4f0a-b9ea-579fc268806e"). InnerVolumeSpecName "kube-api-access-ntppr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.655112 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "615ee6d9-0216-4f0a-b9ea-579fc268806e" (UID: "615ee6d9-0216-4f0a-b9ea-579fc268806e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.656609 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-config" (OuterVolumeSpecName: "config") pod "615ee6d9-0216-4f0a-b9ea-579fc268806e" (UID: "615ee6d9-0216-4f0a-b9ea-579fc268806e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.660986 4651 generic.go:334] "Generic (PLEG): container finished" podID="615ee6d9-0216-4f0a-b9ea-579fc268806e" containerID="07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34" exitCode=0 Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.661100 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fff66767-89s67" event={"ID":"615ee6d9-0216-4f0a-b9ea-579fc268806e","Type":"ContainerDied","Data":"07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34"} Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.661168 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fff66767-89s67" event={"ID":"615ee6d9-0216-4f0a-b9ea-579fc268806e","Type":"ContainerDied","Data":"135a6a74852445d41dad7330fcd50167b9617fae108173f96b238547fd3d2a37"} Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.661193 4651 scope.go:117] "RemoveContainer" containerID="07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.661483 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57fff66767-89s67" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.661859 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "615ee6d9-0216-4f0a-b9ea-579fc268806e" (UID: "615ee6d9-0216-4f0a-b9ea-579fc268806e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.666092 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntppr\" (UniqueName: \"kubernetes.io/projected/615ee6d9-0216-4f0a-b9ea-579fc268806e-kube-api-access-ntppr\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.666268 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.666349 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.666422 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.668975 4651 generic.go:334] "Generic (PLEG): container finished" podID="ad85fcab-3573-4019-89bc-f35413ff0a9d" containerID="9eec96ebad56c4bde87309892af8528ac22137803befe3b50c792a3509d4efc1" exitCode=0 Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.669835 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tpjxc" event={"ID":"ad85fcab-3573-4019-89bc-f35413ff0a9d","Type":"ContainerDied","Data":"9eec96ebad56c4bde87309892af8528ac22137803befe3b50c792a3509d4efc1"} Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.696358 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "615ee6d9-0216-4f0a-b9ea-579fc268806e" (UID: 
"615ee6d9-0216-4f0a-b9ea-579fc268806e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.713230 4651 scope.go:117] "RemoveContainer" containerID="1dddc5099281e828e58cb406e12de033cb05095a3ad464954ef35f6aeca334b9" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.743564 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.743638 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.745113 4651 scope.go:117] "RemoveContainer" containerID="07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34" Nov 26 15:09:35 crc kubenswrapper[4651]: E1126 15:09:35.745784 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34\": container with ID starting with 07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34 not found: ID does not exist" containerID="07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.745821 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34"} err="failed to get container status 
\"07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34\": rpc error: code = NotFound desc = could not find container \"07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34\": container with ID starting with 07770b745dd2035d71f3167553fe4cdc481fe18c779b8eb979c2262aa626fc34 not found: ID does not exist" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.745846 4651 scope.go:117] "RemoveContainer" containerID="1dddc5099281e828e58cb406e12de033cb05095a3ad464954ef35f6aeca334b9" Nov 26 15:09:35 crc kubenswrapper[4651]: E1126 15:09:35.746116 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dddc5099281e828e58cb406e12de033cb05095a3ad464954ef35f6aeca334b9\": container with ID starting with 1dddc5099281e828e58cb406e12de033cb05095a3ad464954ef35f6aeca334b9 not found: ID does not exist" containerID="1dddc5099281e828e58cb406e12de033cb05095a3ad464954ef35f6aeca334b9" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.746143 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dddc5099281e828e58cb406e12de033cb05095a3ad464954ef35f6aeca334b9"} err="failed to get container status \"1dddc5099281e828e58cb406e12de033cb05095a3ad464954ef35f6aeca334b9\": rpc error: code = NotFound desc = could not find container \"1dddc5099281e828e58cb406e12de033cb05095a3ad464954ef35f6aeca334b9\": container with ID starting with 1dddc5099281e828e58cb406e12de033cb05095a3ad464954ef35f6aeca334b9 not found: ID does not exist" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.768922 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/615ee6d9-0216-4f0a-b9ea-579fc268806e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:35 crc kubenswrapper[4651]: I1126 15:09:35.999722 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57fff66767-89s67"] Nov 26 15:09:36 
crc kubenswrapper[4651]: I1126 15:09:36.008260 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57fff66767-89s67"] Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.642350 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.679746 4651 generic.go:334] "Generic (PLEG): container finished" podID="86a2f131-c449-4541-9822-75711dee8ad3" containerID="e6ee1006ac27ab0156c31ea55455ea2e3575009349594dabbb480fe3092cd6ad" exitCode=0 Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.679836 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qmd5q" event={"ID":"86a2f131-c449-4541-9822-75711dee8ad3","Type":"ContainerDied","Data":"e6ee1006ac27ab0156c31ea55455ea2e3575009349594dabbb480fe3092cd6ad"} Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.683221 4651 generic.go:334] "Generic (PLEG): container finished" podID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerID="fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88" exitCode=137 Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.683259 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.683276 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de01eb81-c2a9-4cb9-88e6-ee8484accc7b","Type":"ContainerDied","Data":"fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88"} Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.683321 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de01eb81-c2a9-4cb9-88e6-ee8484accc7b","Type":"ContainerDied","Data":"24eaace5342091f657b0f26a66c875c216ba8f91e882d1e12d2f1c7dc9079a95"} Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.683353 4651 scope.go:117] "RemoveContainer" containerID="fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.744058 4651 scope.go:117] "RemoveContainer" containerID="076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.775228 4651 scope.go:117] "RemoveContainer" containerID="420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.795949 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-config-data\") pod \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.796077 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-run-httpd\") pod \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.796165 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-combined-ca-bundle\") pod \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.796235 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-scripts\") pod \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.796270 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-log-httpd\") pod \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.796331 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-sg-core-conf-yaml\") pod \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.796359 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9mbg\" (UniqueName: \"kubernetes.io/projected/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-kube-api-access-k9mbg\") pod \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\" (UID: \"de01eb81-c2a9-4cb9-88e6-ee8484accc7b\") " Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.797622 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de01eb81-c2a9-4cb9-88e6-ee8484accc7b" (UID: "de01eb81-c2a9-4cb9-88e6-ee8484accc7b"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.798238 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de01eb81-c2a9-4cb9-88e6-ee8484accc7b" (UID: "de01eb81-c2a9-4cb9-88e6-ee8484accc7b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.812525 4651 scope.go:117] "RemoveContainer" containerID="d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.813571 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-scripts" (OuterVolumeSpecName: "scripts") pod "de01eb81-c2a9-4cb9-88e6-ee8484accc7b" (UID: "de01eb81-c2a9-4cb9-88e6-ee8484accc7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.818319 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-kube-api-access-k9mbg" (OuterVolumeSpecName: "kube-api-access-k9mbg") pod "de01eb81-c2a9-4cb9-88e6-ee8484accc7b" (UID: "de01eb81-c2a9-4cb9-88e6-ee8484accc7b"). InnerVolumeSpecName "kube-api-access-k9mbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.861596 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de01eb81-c2a9-4cb9-88e6-ee8484accc7b" (UID: "de01eb81-c2a9-4cb9-88e6-ee8484accc7b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.900328 4651 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.900368 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.900377 4651 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.900386 4651 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.900399 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9mbg\" (UniqueName: \"kubernetes.io/projected/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-kube-api-access-k9mbg\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.903028 4651 scope.go:117] "RemoveContainer" containerID="fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88" Nov 26 15:09:36 crc kubenswrapper[4651]: E1126 15:09:36.905208 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88\": container with ID starting with fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88 not found: ID does not exist" 
containerID="fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.905245 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88"} err="failed to get container status \"fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88\": rpc error: code = NotFound desc = could not find container \"fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88\": container with ID starting with fd5ad26777c4d8dbe330012d35461571b017d04100eb59c83e46ba169cc84f88 not found: ID does not exist" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.905269 4651 scope.go:117] "RemoveContainer" containerID="076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c" Nov 26 15:09:36 crc kubenswrapper[4651]: E1126 15:09:36.906247 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c\": container with ID starting with 076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c not found: ID does not exist" containerID="076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.906280 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c"} err="failed to get container status \"076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c\": rpc error: code = NotFound desc = could not find container \"076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c\": container with ID starting with 076f1292e8a4646c17232db48a3cdcf17145ec97131a7a559db05765cb57646c not found: ID does not exist" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.906299 4651 scope.go:117] 
"RemoveContainer" containerID="420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.906408 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de01eb81-c2a9-4cb9-88e6-ee8484accc7b" (UID: "de01eb81-c2a9-4cb9-88e6-ee8484accc7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:36 crc kubenswrapper[4651]: E1126 15:09:36.906817 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d\": container with ID starting with 420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d not found: ID does not exist" containerID="420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.906843 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d"} err="failed to get container status \"420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d\": rpc error: code = NotFound desc = could not find container \"420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d\": container with ID starting with 420fdf9ba3a4b3f84685431bf4c35ad12725802f14cf670a021f96763b44977d not found: ID does not exist" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.906861 4651 scope.go:117] "RemoveContainer" containerID="d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1" Nov 26 15:09:36 crc kubenswrapper[4651]: E1126 15:09:36.907139 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1\": container with ID starting with d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1 not found: ID does not exist" containerID="d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1" Nov 26 15:09:36 crc kubenswrapper[4651]: I1126 15:09:36.907163 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1"} err="failed to get container status \"d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1\": rpc error: code = NotFound desc = could not find container \"d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1\": container with ID starting with d09d5bc0dde703397a6a23c85841fdf755380937943c0a634e70270a71f701f1 not found: ID does not exist" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.003692 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.008156 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.008225 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.048480 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-config-data" (OuterVolumeSpecName: "config-data") pod "de01eb81-c2a9-4cb9-88e6-ee8484accc7b" (UID: "de01eb81-c2a9-4cb9-88e6-ee8484accc7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.105262 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de01eb81-c2a9-4cb9-88e6-ee8484accc7b-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.160432 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.311760 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-scripts\") pod \"ad85fcab-3573-4019-89bc-f35413ff0a9d\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.311871 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjvtl\" (UniqueName: \"kubernetes.io/projected/ad85fcab-3573-4019-89bc-f35413ff0a9d-kube-api-access-cjvtl\") pod \"ad85fcab-3573-4019-89bc-f35413ff0a9d\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.311941 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-config-data\") pod \"ad85fcab-3573-4019-89bc-f35413ff0a9d\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.312122 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-combined-ca-bundle\") pod \"ad85fcab-3573-4019-89bc-f35413ff0a9d\" (UID: \"ad85fcab-3573-4019-89bc-f35413ff0a9d\") " Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.323790 4651 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad85fcab-3573-4019-89bc-f35413ff0a9d-kube-api-access-cjvtl" (OuterVolumeSpecName: "kube-api-access-cjvtl") pod "ad85fcab-3573-4019-89bc-f35413ff0a9d" (UID: "ad85fcab-3573-4019-89bc-f35413ff0a9d"). InnerVolumeSpecName "kube-api-access-cjvtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.324017 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-scripts" (OuterVolumeSpecName: "scripts") pod "ad85fcab-3573-4019-89bc-f35413ff0a9d" (UID: "ad85fcab-3573-4019-89bc-f35413ff0a9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.351158 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad85fcab-3573-4019-89bc-f35413ff0a9d" (UID: "ad85fcab-3573-4019-89bc-f35413ff0a9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.358184 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.362683 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-config-data" (OuterVolumeSpecName: "config-data") pod "ad85fcab-3573-4019-89bc-f35413ff0a9d" (UID: "ad85fcab-3573-4019-89bc-f35413ff0a9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.368948 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389233 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:09:37 crc kubenswrapper[4651]: E1126 15:09:37.389638 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="sg-core" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389650 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="sg-core" Nov 26 15:09:37 crc kubenswrapper[4651]: E1126 15:09:37.389661 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="proxy-httpd" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389677 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="proxy-httpd" Nov 26 15:09:37 crc kubenswrapper[4651]: E1126 15:09:37.389687 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615ee6d9-0216-4f0a-b9ea-579fc268806e" containerName="dnsmasq-dns" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389693 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="615ee6d9-0216-4f0a-b9ea-579fc268806e" containerName="dnsmasq-dns" Nov 26 15:09:37 crc kubenswrapper[4651]: E1126 15:09:37.389702 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad85fcab-3573-4019-89bc-f35413ff0a9d" containerName="nova-manage" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389707 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad85fcab-3573-4019-89bc-f35413ff0a9d" containerName="nova-manage" Nov 26 15:09:37 crc kubenswrapper[4651]: E1126 15:09:37.389719 4651 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="615ee6d9-0216-4f0a-b9ea-579fc268806e" containerName="init" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389724 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="615ee6d9-0216-4f0a-b9ea-579fc268806e" containerName="init" Nov 26 15:09:37 crc kubenswrapper[4651]: E1126 15:09:37.389734 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="ceilometer-notification-agent" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389740 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="ceilometer-notification-agent" Nov 26 15:09:37 crc kubenswrapper[4651]: E1126 15:09:37.389758 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="ceilometer-central-agent" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389766 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="ceilometer-central-agent" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389928 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="proxy-httpd" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389938 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="615ee6d9-0216-4f0a-b9ea-579fc268806e" containerName="dnsmasq-dns" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389945 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="sg-core" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389962 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="ceilometer-notification-agent" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389972 
4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" containerName="ceilometer-central-agent" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.389980 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad85fcab-3573-4019-89bc-f35413ff0a9d" containerName="nova-manage" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.391707 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.397403 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.397628 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.400601 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.414864 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjvtl\" (UniqueName: \"kubernetes.io/projected/ad85fcab-3573-4019-89bc-f35413ff0a9d-kube-api-access-cjvtl\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.414894 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.414904 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.414913 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ad85fcab-3573-4019-89bc-f35413ff0a9d-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.418278 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615ee6d9-0216-4f0a-b9ea-579fc268806e" path="/var/lib/kubelet/pods/615ee6d9-0216-4f0a-b9ea-579fc268806e/volumes" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.419282 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de01eb81-c2a9-4cb9-88e6-ee8484accc7b" path="/var/lib/kubelet/pods/de01eb81-c2a9-4cb9-88e6-ee8484accc7b/volumes" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.516301 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-config-data\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.516355 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fj7g\" (UniqueName: \"kubernetes.io/projected/c5d3aa80-0713-402e-96de-d15177f6e7b2-kube-api-access-5fj7g\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.516384 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.516426 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-run-httpd\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.516451 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-scripts\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.516701 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.516839 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-log-httpd\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.619067 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.619403 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-log-httpd\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 
15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.619579 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-config-data\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.619707 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fj7g\" (UniqueName: \"kubernetes.io/projected/c5d3aa80-0713-402e-96de-d15177f6e7b2-kube-api-access-5fj7g\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.619809 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.619954 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-run-httpd\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.620098 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-scripts\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.621321 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-run-httpd\") 
pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.622925 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-log-httpd\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.623580 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-scripts\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.624254 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-config-data\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.624893 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.625082 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.640797 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fj7g\" (UniqueName: 
\"kubernetes.io/projected/c5d3aa80-0713-402e-96de-d15177f6e7b2-kube-api-access-5fj7g\") pod \"ceilometer-0\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.704068 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tpjxc" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.704076 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tpjxc" event={"ID":"ad85fcab-3573-4019-89bc-f35413ff0a9d","Type":"ContainerDied","Data":"b974712bc08c246a869f9ba6b0cbdbc26d1cc34162b83cf4bc81cbe42249cd92"} Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.704122 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b974712bc08c246a869f9ba6b0cbdbc26d1cc34162b83cf4bc81cbe42249cd92" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.711517 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.829589 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.829830 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerName="nova-api-log" containerID="cri-o://4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0" gracePeriod=30 Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.830281 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerName="nova-api-api" containerID="cri-o://7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0" gracePeriod=30 Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.839768 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.840115 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dfde06fb-eb75-4221-a100-e2315fec4e5c" containerName="nova-scheduler-scheduler" containerID="cri-o://da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a" gracePeriod=30 Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.857556 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.857772 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c23199d-dbc9-4ba3-b993-648ef41a976f" containerName="nova-metadata-log" containerID="cri-o://5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc" gracePeriod=30 Nov 26 15:09:37 crc kubenswrapper[4651]: I1126 15:09:37.858196 4651 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c23199d-dbc9-4ba3-b993-648ef41a976f" containerName="nova-metadata-metadata" containerID="cri-o://344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8" gracePeriod=30 Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.328799 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.441680 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:09:38 crc kubenswrapper[4651]: W1126 15:09:38.444526 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5d3aa80_0713_402e_96de_d15177f6e7b2.slice/crio-fc8f8757318b49ab27d2d4950fb2d3f84a5a916a63347e795aa92325c37060ea WatchSource:0}: Error finding container fc8f8757318b49ab27d2d4950fb2d3f84a5a916a63347e795aa92325c37060ea: Status 404 returned error can't find the container with id fc8f8757318b49ab27d2d4950fb2d3f84a5a916a63347e795aa92325c37060ea Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.449143 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-config-data\") pod \"86a2f131-c449-4541-9822-75711dee8ad3\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.449371 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl4lt\" (UniqueName: \"kubernetes.io/projected/86a2f131-c449-4541-9822-75711dee8ad3-kube-api-access-zl4lt\") pod \"86a2f131-c449-4541-9822-75711dee8ad3\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.450858 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-scripts\") pod \"86a2f131-c449-4541-9822-75711dee8ad3\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.451018 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-combined-ca-bundle\") pod \"86a2f131-c449-4541-9822-75711dee8ad3\" (UID: \"86a2f131-c449-4541-9822-75711dee8ad3\") " Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.455220 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86a2f131-c449-4541-9822-75711dee8ad3-kube-api-access-zl4lt" (OuterVolumeSpecName: "kube-api-access-zl4lt") pod "86a2f131-c449-4541-9822-75711dee8ad3" (UID: "86a2f131-c449-4541-9822-75711dee8ad3"). InnerVolumeSpecName "kube-api-access-zl4lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.464944 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-scripts" (OuterVolumeSpecName: "scripts") pod "86a2f131-c449-4541-9822-75711dee8ad3" (UID: "86a2f131-c449-4541-9822-75711dee8ad3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.495959 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86a2f131-c449-4541-9822-75711dee8ad3" (UID: "86a2f131-c449-4541-9822-75711dee8ad3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.505153 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-config-data" (OuterVolumeSpecName: "config-data") pod "86a2f131-c449-4541-9822-75711dee8ad3" (UID: "86a2f131-c449-4541-9822-75711dee8ad3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.555400 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.555710 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl4lt\" (UniqueName: \"kubernetes.io/projected/86a2f131-c449-4541-9822-75711dee8ad3-kube-api-access-zl4lt\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.555806 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.555875 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86a2f131-c449-4541-9822-75711dee8ad3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.690745 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.735790 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5d3aa80-0713-402e-96de-d15177f6e7b2","Type":"ContainerStarted","Data":"fc8f8757318b49ab27d2d4950fb2d3f84a5a916a63347e795aa92325c37060ea"} Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.739728 4651 generic.go:334] "Generic (PLEG): container finished" podID="0c23199d-dbc9-4ba3-b993-648ef41a976f" containerID="344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8" exitCode=0 Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.739756 4651 generic.go:334] "Generic (PLEG): container finished" podID="0c23199d-dbc9-4ba3-b993-648ef41a976f" containerID="5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc" exitCode=143 Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.739793 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c23199d-dbc9-4ba3-b993-648ef41a976f","Type":"ContainerDied","Data":"344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8"} Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.739819 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c23199d-dbc9-4ba3-b993-648ef41a976f","Type":"ContainerDied","Data":"5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc"} Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.739828 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c23199d-dbc9-4ba3-b993-648ef41a976f","Type":"ContainerDied","Data":"256fa58353df1625712e68135bc3ce66e798ad8cb038371f33c72f0ab7ce56e1"} Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.739842 4651 scope.go:117] "RemoveContainer" containerID="344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 
15:09:38.739958 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.746976 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qmd5q" event={"ID":"86a2f131-c449-4541-9822-75711dee8ad3","Type":"ContainerDied","Data":"6ca2e021e879a39af180b2ead0195b0933490ae1da77ea0e233e90766f4e01c4"} Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.747026 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ca2e021e879a39af180b2ead0195b0933490ae1da77ea0e233e90766f4e01c4" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.747135 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qmd5q" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.799343 4651 generic.go:334] "Generic (PLEG): container finished" podID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerID="4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0" exitCode=143 Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.799396 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59","Type":"ContainerDied","Data":"4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0"} Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.810291 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:09:38 crc kubenswrapper[4651]: E1126 15:09:38.810668 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a2f131-c449-4541-9822-75711dee8ad3" containerName="nova-cell1-conductor-db-sync" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.810686 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a2f131-c449-4541-9822-75711dee8ad3" containerName="nova-cell1-conductor-db-sync" Nov 26 
15:09:38 crc kubenswrapper[4651]: E1126 15:09:38.810721 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c23199d-dbc9-4ba3-b993-648ef41a976f" containerName="nova-metadata-log" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.810728 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c23199d-dbc9-4ba3-b993-648ef41a976f" containerName="nova-metadata-log" Nov 26 15:09:38 crc kubenswrapper[4651]: E1126 15:09:38.810737 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c23199d-dbc9-4ba3-b993-648ef41a976f" containerName="nova-metadata-metadata" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.810743 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c23199d-dbc9-4ba3-b993-648ef41a976f" containerName="nova-metadata-metadata" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.810888 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c23199d-dbc9-4ba3-b993-648ef41a976f" containerName="nova-metadata-metadata" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.810901 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a2f131-c449-4541-9822-75711dee8ad3" containerName="nova-cell1-conductor-db-sync" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.810910 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c23199d-dbc9-4ba3-b993-648ef41a976f" containerName="nova-metadata-log" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.811506 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.821499 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.836524 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.851199 4651 scope.go:117] "RemoveContainer" containerID="5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.861921 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-nova-metadata-tls-certs\") pod \"0c23199d-dbc9-4ba3-b993-648ef41a976f\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.862065 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-config-data\") pod \"0c23199d-dbc9-4ba3-b993-648ef41a976f\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.862128 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c23199d-dbc9-4ba3-b993-648ef41a976f-logs\") pod \"0c23199d-dbc9-4ba3-b993-648ef41a976f\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.862152 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcwjx\" (UniqueName: \"kubernetes.io/projected/0c23199d-dbc9-4ba3-b993-648ef41a976f-kube-api-access-xcwjx\") pod \"0c23199d-dbc9-4ba3-b993-648ef41a976f\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " Nov 
26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.862178 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-combined-ca-bundle\") pod \"0c23199d-dbc9-4ba3-b993-648ef41a976f\" (UID: \"0c23199d-dbc9-4ba3-b993-648ef41a976f\") " Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.863754 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c23199d-dbc9-4ba3-b993-648ef41a976f-logs" (OuterVolumeSpecName: "logs") pod "0c23199d-dbc9-4ba3-b993-648ef41a976f" (UID: "0c23199d-dbc9-4ba3-b993-648ef41a976f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.869146 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c23199d-dbc9-4ba3-b993-648ef41a976f-kube-api-access-xcwjx" (OuterVolumeSpecName: "kube-api-access-xcwjx") pod "0c23199d-dbc9-4ba3-b993-648ef41a976f" (UID: "0c23199d-dbc9-4ba3-b993-648ef41a976f"). InnerVolumeSpecName "kube-api-access-xcwjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.900740 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-config-data" (OuterVolumeSpecName: "config-data") pod "0c23199d-dbc9-4ba3-b993-648ef41a976f" (UID: "0c23199d-dbc9-4ba3-b993-648ef41a976f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.902293 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c23199d-dbc9-4ba3-b993-648ef41a976f" (UID: "0c23199d-dbc9-4ba3-b993-648ef41a976f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.903839 4651 scope.go:117] "RemoveContainer" containerID="344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8" Nov 26 15:09:38 crc kubenswrapper[4651]: E1126 15:09:38.915470 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8\": container with ID starting with 344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8 not found: ID does not exist" containerID="344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.915523 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8"} err="failed to get container status \"344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8\": rpc error: code = NotFound desc = could not find container \"344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8\": container with ID starting with 344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8 not found: ID does not exist" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.915548 4651 scope.go:117] "RemoveContainer" containerID="5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc" Nov 26 15:09:38 crc kubenswrapper[4651]: E1126 15:09:38.917858 4651 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc\": container with ID starting with 5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc not found: ID does not exist" containerID="5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.917890 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc"} err="failed to get container status \"5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc\": rpc error: code = NotFound desc = could not find container \"5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc\": container with ID starting with 5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc not found: ID does not exist" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.917906 4651 scope.go:117] "RemoveContainer" containerID="344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.918206 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8"} err="failed to get container status \"344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8\": rpc error: code = NotFound desc = could not find container \"344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8\": container with ID starting with 344801a3c88eeddf8c30b8b8dc2508b7e4831e5e25c9eb24ed96eff3eb8826f8 not found: ID does not exist" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.918222 4651 scope.go:117] "RemoveContainer" containerID="5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.918717 4651 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc"} err="failed to get container status \"5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc\": rpc error: code = NotFound desc = could not find container \"5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc\": container with ID starting with 5ce03882b61640191493edce634d98e30a3b3b8cc2d46c84a3346664d5e2f0bc not found: ID does not exist" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.965734 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d317ad85-fe1e-4e2d-b7ac-745c2efb8a44-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d317ad85-fe1e-4e2d-b7ac-745c2efb8a44\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.965967 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d317ad85-fe1e-4e2d-b7ac-745c2efb8a44-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d317ad85-fe1e-4e2d-b7ac-745c2efb8a44\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.965995 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf2bm\" (UniqueName: \"kubernetes.io/projected/d317ad85-fe1e-4e2d-b7ac-745c2efb8a44-kube-api-access-tf2bm\") pod \"nova-cell1-conductor-0\" (UID: \"d317ad85-fe1e-4e2d-b7ac-745c2efb8a44\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.966085 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c23199d-dbc9-4ba3-b993-648ef41a976f-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:38 crc 
kubenswrapper[4651]: I1126 15:09:38.966095 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcwjx\" (UniqueName: \"kubernetes.io/projected/0c23199d-dbc9-4ba3-b993-648ef41a976f-kube-api-access-xcwjx\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.966105 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.966116 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:38 crc kubenswrapper[4651]: I1126 15:09:38.976834 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0c23199d-dbc9-4ba3-b993-648ef41a976f" (UID: "0c23199d-dbc9-4ba3-b993-648ef41a976f"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.067320 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d317ad85-fe1e-4e2d-b7ac-745c2efb8a44-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d317ad85-fe1e-4e2d-b7ac-745c2efb8a44\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.067358 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf2bm\" (UniqueName: \"kubernetes.io/projected/d317ad85-fe1e-4e2d-b7ac-745c2efb8a44-kube-api-access-tf2bm\") pod \"nova-cell1-conductor-0\" (UID: \"d317ad85-fe1e-4e2d-b7ac-745c2efb8a44\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.067407 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d317ad85-fe1e-4e2d-b7ac-745c2efb8a44-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d317ad85-fe1e-4e2d-b7ac-745c2efb8a44\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.067558 4651 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c23199d-dbc9-4ba3-b993-648ef41a976f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.072754 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d317ad85-fe1e-4e2d-b7ac-745c2efb8a44-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d317ad85-fe1e-4e2d-b7ac-745c2efb8a44\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.077620 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d317ad85-fe1e-4e2d-b7ac-745c2efb8a44-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d317ad85-fe1e-4e2d-b7ac-745c2efb8a44\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.088917 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.099195 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.123869 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf2bm\" (UniqueName: \"kubernetes.io/projected/d317ad85-fe1e-4e2d-b7ac-745c2efb8a44-kube-api-access-tf2bm\") pod \"nova-cell1-conductor-0\" (UID: \"d317ad85-fe1e-4e2d-b7ac-745c2efb8a44\") " pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.147914 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.163533 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.176243 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.178733 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.178964 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.202118 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.271457 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-logs\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.271765 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.271792 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.271892 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-config-data\") pod \"nova-metadata-0\" (UID: 
\"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.271970 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmqct\" (UniqueName: \"kubernetes.io/projected/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-kube-api-access-tmqct\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.373604 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-logs\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.373662 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.373701 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.373794 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-config-data\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.373873 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmqct\" (UniqueName: \"kubernetes.io/projected/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-kube-api-access-tmqct\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.374141 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-logs\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.381289 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.381367 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.392081 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-config-data\") pod \"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.394027 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmqct\" (UniqueName: \"kubernetes.io/projected/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-kube-api-access-tmqct\") pod 
\"nova-metadata-0\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.433694 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c23199d-dbc9-4ba3-b993-648ef41a976f" path="/var/lib/kubelet/pods/0c23199d-dbc9-4ba3-b993-648ef41a976f/volumes" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.503505 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:09:39 crc kubenswrapper[4651]: E1126 15:09:39.618628 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:09:39 crc kubenswrapper[4651]: E1126 15:09:39.620300 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:09:39 crc kubenswrapper[4651]: E1126 15:09:39.642586 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:09:39 crc kubenswrapper[4651]: E1126 15:09:39.642649 4651 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="dfde06fb-eb75-4221-a100-e2315fec4e5c" containerName="nova-scheduler-scheduler" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.677359 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.829756 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d317ad85-fe1e-4e2d-b7ac-745c2efb8a44","Type":"ContainerStarted","Data":"af36109c2d8753d0ffd723504dbd4d1623b690f56e467a306ea0a448f0a422d4"} Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.836686 4651 generic.go:334] "Generic (PLEG): container finished" podID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerID="a9e18539050248184621d537c94cd7c6c67bed8a523b93401626fecf6ae227ef" exitCode=137 Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.836734 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6974b49b94-vzn8h" event={"ID":"97c5789f-f8f7-4780-8c73-e34bc5bb4f56","Type":"ContainerDied","Data":"a9e18539050248184621d537c94cd7c6c67bed8a523b93401626fecf6ae227ef"} Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.836760 4651 scope.go:117] "RemoveContainer" containerID="bc932f0bacd9c20ebf1824e9687b4f2688afd1574336c4d92ff1fad88d1f5394" Nov 26 15:09:39 crc kubenswrapper[4651]: I1126 15:09:39.842946 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5d3aa80-0713-402e-96de-d15177f6e7b2","Type":"ContainerStarted","Data":"189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987"} Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.026970 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.227001 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57fff66767-89s67" podUID="615ee6d9-0216-4f0a-b9ea-579fc268806e" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.0.163:5353: i/o timeout" Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.852720 4651 generic.go:334] "Generic (PLEG): container finished" podID="5c09de21-84b0-440d-b34c-3054ec6741fc" containerID="b24d36253d1184088df8f38e2aa41ad3371af1bbbe82d56ef4835ace475fee82" exitCode=137 Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.852804 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f54c7c77d-rx8gm" event={"ID":"5c09de21-84b0-440d-b34c-3054ec6741fc","Type":"ContainerDied","Data":"b24d36253d1184088df8f38e2aa41ad3371af1bbbe82d56ef4835ace475fee82"} Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.853085 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f54c7c77d-rx8gm" event={"ID":"5c09de21-84b0-440d-b34c-3054ec6741fc","Type":"ContainerStarted","Data":"21876ff057c2c77be9a51b05e503fff2b01a8e554537ff835f53e6dcd5e462a5"} Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.853106 4651 scope.go:117] "RemoveContainer" containerID="56761142c110a594c6d6a7518e9e4944e0f87669709325bcff97b8c278e4b419" Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.857323 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6974b49b94-vzn8h" event={"ID":"97c5789f-f8f7-4780-8c73-e34bc5bb4f56","Type":"ContainerStarted","Data":"e459d337cfdf21c6171a193e9e9d70d57ce29ab97edf0ea60127ef435043b603"} Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.860512 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5d3aa80-0713-402e-96de-d15177f6e7b2","Type":"ContainerStarted","Data":"77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13"} Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.862831 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"fa55b0ac-a745-462a-a3a9-2bf9266f60a8","Type":"ContainerStarted","Data":"8c9fe5be740e9003884ecb7d4016fca9c33b14d93ae801df5652b5720280676e"} Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.862961 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa55b0ac-a745-462a-a3a9-2bf9266f60a8","Type":"ContainerStarted","Data":"45ba5535542b59701406caadd2410eea4b79aae4fde3b5ba66e91d74fb60bc2b"} Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.863063 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa55b0ac-a745-462a-a3a9-2bf9266f60a8","Type":"ContainerStarted","Data":"7a5f723fe2aefe63370b51ebef592599cbe7889e927abc9f3de1257781a8b439"} Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.864601 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d317ad85-fe1e-4e2d-b7ac-745c2efb8a44","Type":"ContainerStarted","Data":"5ce848c9e1db586a7426989d65e79aa95803d6b95b91e233458ba7ccde2bf1a5"} Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.864745 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.952729 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.952708645 podStartE2EDuration="1.952708645s" podCreationTimestamp="2025-11-26 15:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:40.925587154 +0000 UTC m=+1148.351334778" watchObservedRunningTime="2025-11-26 15:09:40.952708645 +0000 UTC m=+1148.378456249" Nov 26 15:09:40 crc kubenswrapper[4651]: I1126 15:09:40.990750 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.990731573 
podStartE2EDuration="2.990731573s" podCreationTimestamp="2025-11-26 15:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:40.988094831 +0000 UTC m=+1148.413842435" watchObservedRunningTime="2025-11-26 15:09:40.990731573 +0000 UTC m=+1148.416479177" Nov 26 15:09:41 crc kubenswrapper[4651]: I1126 15:09:41.316602 4651 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:09:41 crc kubenswrapper[4651]: I1126 15:09:41.875192 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5d3aa80-0713-402e-96de-d15177f6e7b2","Type":"ContainerStarted","Data":"65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264"} Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.517753 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.679788 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-config-data\") pod \"dfde06fb-eb75-4221-a100-e2315fec4e5c\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.680881 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-combined-ca-bundle\") pod \"dfde06fb-eb75-4221-a100-e2315fec4e5c\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.681090 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz8x2\" (UniqueName: \"kubernetes.io/projected/dfde06fb-eb75-4221-a100-e2315fec4e5c-kube-api-access-sz8x2\") pod 
\"dfde06fb-eb75-4221-a100-e2315fec4e5c\" (UID: \"dfde06fb-eb75-4221-a100-e2315fec4e5c\") " Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.716302 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfde06fb-eb75-4221-a100-e2315fec4e5c-kube-api-access-sz8x2" (OuterVolumeSpecName: "kube-api-access-sz8x2") pod "dfde06fb-eb75-4221-a100-e2315fec4e5c" (UID: "dfde06fb-eb75-4221-a100-e2315fec4e5c"). InnerVolumeSpecName "kube-api-access-sz8x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.737794 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-config-data" (OuterVolumeSpecName: "config-data") pod "dfde06fb-eb75-4221-a100-e2315fec4e5c" (UID: "dfde06fb-eb75-4221-a100-e2315fec4e5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.740857 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfde06fb-eb75-4221-a100-e2315fec4e5c" (UID: "dfde06fb-eb75-4221-a100-e2315fec4e5c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.793729 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.793761 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz8x2\" (UniqueName: \"kubernetes.io/projected/dfde06fb-eb75-4221-a100-e2315fec4e5c-kube-api-access-sz8x2\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.793773 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfde06fb-eb75-4221-a100-e2315fec4e5c-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.881339 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.914358 4651 generic.go:334] "Generic (PLEG): container finished" podID="dfde06fb-eb75-4221-a100-e2315fec4e5c" containerID="da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a" exitCode=0 Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.914459 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dfde06fb-eb75-4221-a100-e2315fec4e5c","Type":"ContainerDied","Data":"da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a"} Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.914489 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dfde06fb-eb75-4221-a100-e2315fec4e5c","Type":"ContainerDied","Data":"b8ab35f11ead8d780fa5ea965b0ad78f6ff051cc4411aa08b3b5c101d7f7274c"} Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.914510 4651 scope.go:117] "RemoveContainer" 
containerID="da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.914722 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.956148 4651 generic.go:334] "Generic (PLEG): container finished" podID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerID="7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0" exitCode=0 Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.956199 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59","Type":"ContainerDied","Data":"7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0"} Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.956225 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59","Type":"ContainerDied","Data":"1e525a3b792b9c8fbc3c4dedca2d426b454584001127985ca7f54fc483c5a99e"} Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.956357 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.977811 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.980453 4651 scope.go:117] "RemoveContainer" containerID="da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a" Nov 26 15:09:42 crc kubenswrapper[4651]: E1126 15:09:42.981668 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a\": container with ID starting with da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a not found: ID does not exist" containerID="da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.981709 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a"} err="failed to get container status \"da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a\": rpc error: code = NotFound desc = could not find container \"da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a\": container with ID starting with da8509a0cbec21f9c2a339c2d682a1779a95b1c8064e4578ab931d8e5d8d680a not found: ID does not exist" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.981729 4651 scope.go:117] "RemoveContainer" containerID="7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0" Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.998702 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-config-data\") pod \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " Nov 26 15:09:42 crc 
kubenswrapper[4651]: I1126 15:09:42.999171 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-combined-ca-bundle\") pod \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.999307 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk8hd\" (UniqueName: \"kubernetes.io/projected/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-kube-api-access-dk8hd\") pod \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " Nov 26 15:09:42 crc kubenswrapper[4651]: I1126 15:09:42.999435 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-logs\") pod \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\" (UID: \"c0e2eeb8-87bd-40d3-a874-9ed50caf4b59\") " Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.000382 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-logs" (OuterVolumeSpecName: "logs") pod "c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" (UID: "c0e2eeb8-87bd-40d3-a874-9ed50caf4b59"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.007801 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.030704 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:09:43 crc kubenswrapper[4651]: E1126 15:09:43.031244 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfde06fb-eb75-4221-a100-e2315fec4e5c" containerName="nova-scheduler-scheduler" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.031271 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfde06fb-eb75-4221-a100-e2315fec4e5c" containerName="nova-scheduler-scheduler" Nov 26 15:09:43 crc kubenswrapper[4651]: E1126 15:09:43.031299 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerName="nova-api-log" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.031308 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerName="nova-api-log" Nov 26 15:09:43 crc kubenswrapper[4651]: E1126 15:09:43.031323 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerName="nova-api-api" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.031330 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerName="nova-api-api" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.031543 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerName="nova-api-api" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.031573 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" containerName="nova-api-log" Nov 26 15:09:43 crc kubenswrapper[4651]: 
I1126 15:09:43.031587 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfde06fb-eb75-4221-a100-e2315fec4e5c" containerName="nova-scheduler-scheduler" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.031723 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-kube-api-access-dk8hd" (OuterVolumeSpecName: "kube-api-access-dk8hd") pod "c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" (UID: "c0e2eeb8-87bd-40d3-a874-9ed50caf4b59"). InnerVolumeSpecName "kube-api-access-dk8hd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.032482 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.037753 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.052273 4651 scope.go:117] "RemoveContainer" containerID="4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.056202 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.074171 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" (UID: "c0e2eeb8-87bd-40d3-a874-9ed50caf4b59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.115029 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxg9f\" (UniqueName: \"kubernetes.io/projected/4e94ac87-b21e-4f95-98ac-d97c604aaa30-kube-api-access-bxg9f\") pod \"nova-scheduler-0\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.115204 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.115362 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-config-data\") pod \"nova-scheduler-0\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.115506 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.115573 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.115668 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk8hd\" (UniqueName: \"kubernetes.io/projected/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-kube-api-access-dk8hd\") on node 
\"crc\" DevicePath \"\"" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.144010 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-config-data" (OuterVolumeSpecName: "config-data") pod "c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" (UID: "c0e2eeb8-87bd-40d3-a874-9ed50caf4b59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.217720 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxg9f\" (UniqueName: \"kubernetes.io/projected/4e94ac87-b21e-4f95-98ac-d97c604aaa30-kube-api-access-bxg9f\") pod \"nova-scheduler-0\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.218216 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.218415 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-config-data\") pod \"nova-scheduler-0\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.218736 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:43 crc kubenswrapper[4651]: E1126 15:09:43.235986 4651 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfde06fb_eb75_4221_a100_e2315fec4e5c.slice/crio-b8ab35f11ead8d780fa5ea965b0ad78f6ff051cc4411aa08b3b5c101d7f7274c\": RecentStats: unable to find data in memory cache]" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.237578 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-config-data\") pod \"nova-scheduler-0\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.252638 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxg9f\" (UniqueName: \"kubernetes.io/projected/4e94ac87-b21e-4f95-98ac-d97c604aaa30-kube-api-access-bxg9f\") pod \"nova-scheduler-0\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.252744 4651 scope.go:117] "RemoveContainer" containerID="7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.255008 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " pod="openstack/nova-scheduler-0" Nov 26 15:09:43 crc kubenswrapper[4651]: E1126 15:09:43.258231 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0\": container with ID starting with 7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0 not found: ID does not exist" containerID="7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0" Nov 26 15:09:43 crc 
kubenswrapper[4651]: I1126 15:09:43.258288 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0"} err="failed to get container status \"7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0\": rpc error: code = NotFound desc = could not find container \"7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0\": container with ID starting with 7a34ef3f4df5c8c66bde27a48e3e7d8e8759862709ece7b437525c13a7c75fc0 not found: ID does not exist" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.258318 4651 scope.go:117] "RemoveContainer" containerID="4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0" Nov 26 15:09:43 crc kubenswrapper[4651]: E1126 15:09:43.259809 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0\": container with ID starting with 4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0 not found: ID does not exist" containerID="4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.260106 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0"} err="failed to get container status \"4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0\": rpc error: code = NotFound desc = could not find container \"4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0\": container with ID starting with 4cd08c14a7ba40f842d9f6dabe0c2675d5656a076fbaabf1e4497f60cb182ad0 not found: ID does not exist" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.338935 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:09:43 crc kubenswrapper[4651]: 
I1126 15:09:43.348410 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.359352 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.360797 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.371419 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.380794 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.417459 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0e2eeb8-87bd-40d3-a874-9ed50caf4b59" path="/var/lib/kubelet/pods/c0e2eeb8-87bd-40d3-a874-9ed50caf4b59/volumes" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.419512 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfde06fb-eb75-4221-a100-e2315fec4e5c" path="/var/lib/kubelet/pods/dfde06fb-eb75-4221-a100-e2315fec4e5c/volumes" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.423477 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.423542 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-logs\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: 
I1126 15:09:43.423626 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdhzf\" (UniqueName: \"kubernetes.io/projected/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-kube-api-access-mdhzf\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.431235 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-config-data\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.511444 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.533599 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-config-data\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.533660 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.533687 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-logs\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.533730 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdhzf\" (UniqueName: \"kubernetes.io/projected/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-kube-api-access-mdhzf\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.535479 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-logs\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.540358 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-config-data\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.550599 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.553877 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdhzf\" (UniqueName: \"kubernetes.io/projected/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-kube-api-access-mdhzf\") pod \"nova-api-0\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " pod="openstack/nova-api-0" Nov 26 15:09:43 crc kubenswrapper[4651]: I1126 15:09:43.705315 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:09:44 crc kubenswrapper[4651]: I1126 15:09:44.034166 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5d3aa80-0713-402e-96de-d15177f6e7b2","Type":"ContainerStarted","Data":"28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc"} Nov 26 15:09:44 crc kubenswrapper[4651]: I1126 15:09:44.034473 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 15:09:44 crc kubenswrapper[4651]: I1126 15:09:44.112640 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:09:44 crc kubenswrapper[4651]: I1126 15:09:44.118717 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.709906067 podStartE2EDuration="7.118694385s" podCreationTimestamp="2025-11-26 15:09:37 +0000 UTC" firstStartedPulling="2025-11-26 15:09:38.447235447 +0000 UTC m=+1145.872983061" lastFinishedPulling="2025-11-26 15:09:42.856023775 +0000 UTC m=+1150.281771379" observedRunningTime="2025-11-26 15:09:44.061609225 +0000 UTC m=+1151.487356839" watchObservedRunningTime="2025-11-26 15:09:44.118694385 +0000 UTC m=+1151.544441989" Nov 26 15:09:44 crc kubenswrapper[4651]: I1126 15:09:44.205397 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 26 15:09:44 crc kubenswrapper[4651]: I1126 15:09:44.270719 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:09:44 crc kubenswrapper[4651]: I1126 15:09:44.503809 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 15:09:44 crc kubenswrapper[4651]: I1126 15:09:44.504727 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 15:09:45 crc kubenswrapper[4651]: I1126 15:09:45.128249 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e","Type":"ContainerStarted","Data":"748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c"} Nov 26 15:09:45 crc kubenswrapper[4651]: I1126 15:09:45.129721 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e","Type":"ContainerStarted","Data":"0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1"} Nov 26 15:09:45 crc kubenswrapper[4651]: I1126 15:09:45.129837 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e","Type":"ContainerStarted","Data":"522575712fa63433baad942fcf26ed8b807e20a419eb919a8f789b305df5e220"} Nov 26 15:09:45 crc kubenswrapper[4651]: I1126 15:09:45.152899 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e94ac87-b21e-4f95-98ac-d97c604aaa30","Type":"ContainerStarted","Data":"b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491"} Nov 26 15:09:45 crc kubenswrapper[4651]: I1126 15:09:45.152934 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e94ac87-b21e-4f95-98ac-d97c604aaa30","Type":"ContainerStarted","Data":"e9ee5635a45644b4012fedbeb38fc9dcef6e90326947fe10ac1f5fc74d170d58"} Nov 26 15:09:45 crc kubenswrapper[4651]: I1126 15:09:45.195253 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.194627674 podStartE2EDuration="2.194627674s" podCreationTimestamp="2025-11-26 15:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:45.160923484 +0000 UTC m=+1152.586671098" watchObservedRunningTime="2025-11-26 15:09:45.194627674 +0000 UTC m=+1152.620375278" Nov 26 15:09:45 
crc kubenswrapper[4651]: I1126 15:09:45.202711 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.202689214 podStartE2EDuration="3.202689214s" podCreationTimestamp="2025-11-26 15:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:45.188857057 +0000 UTC m=+1152.614604661" watchObservedRunningTime="2025-11-26 15:09:45.202689214 +0000 UTC m=+1152.628436838" Nov 26 15:09:46 crc kubenswrapper[4651]: E1126 15:09:46.740581 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openstack/swift-storage-0" podUID="a3b8c2db-ce7f-48ce-9fd1-d55b5583773e" Nov 26 15:09:47 crc kubenswrapper[4651]: I1126 15:09:47.174758 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 26 15:09:48 crc kubenswrapper[4651]: I1126 15:09:48.512382 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 26 15:09:49 crc kubenswrapper[4651]: I1126 15:09:49.504528 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 15:09:49 crc kubenswrapper[4651]: I1126 15:09:49.506829 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 15:09:49 crc kubenswrapper[4651]: I1126 15:09:49.611873 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:09:49 crc kubenswrapper[4651]: I1126 15:09:49.611922 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:09:49 crc kubenswrapper[4651]: I1126 15:09:49.613954 4651 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 26 15:09:49 crc kubenswrapper[4651]: I1126 15:09:49.781051 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:09:49 crc kubenswrapper[4651]: I1126 15:09:49.781101 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:09:50 crc kubenswrapper[4651]: I1126 15:09:50.279818 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:09:50 crc kubenswrapper[4651]: E1126 15:09:50.280097 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:09:50 crc kubenswrapper[4651]: E1126 15:09:50.280132 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:09:50 crc kubenswrapper[4651]: E1126 15:09:50.280219 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift podName:a3b8c2db-ce7f-48ce-9fd1-d55b5583773e nodeName:}" failed. No retries permitted until 2025-11-26 15:11:52.280190266 +0000 UTC m=+1279.705937870 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift") pod "swift-storage-0" (UID: "a3b8c2db-ce7f-48ce-9fd1-d55b5583773e") : configmap "swift-ring-files" not found Nov 26 15:09:50 crc kubenswrapper[4651]: I1126 15:09:50.517166 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 15:09:50 crc kubenswrapper[4651]: I1126 15:09:50.517908 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 15:09:53 crc kubenswrapper[4651]: I1126 15:09:53.512274 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 15:09:53 crc kubenswrapper[4651]: I1126 15:09:53.539892 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 15:09:53 crc kubenswrapper[4651]: I1126 15:09:53.706760 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:09:53 crc kubenswrapper[4651]: I1126 15:09:53.706810 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:09:54 crc kubenswrapper[4651]: I1126 15:09:54.264563 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 15:09:54 crc kubenswrapper[4651]: I1126 15:09:54.790277 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:09:54 crc kubenswrapper[4651]: I1126 15:09:54.790323 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:09:59 crc kubenswrapper[4651]: I1126 15:09:59.510478 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 15:09:59 crc kubenswrapper[4651]: I1126 15:09:59.516239 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 15:09:59 crc kubenswrapper[4651]: I1126 15:09:59.519166 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 15:09:59 crc kubenswrapper[4651]: I1126 15:09:59.612423 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 26 15:09:59 crc kubenswrapper[4651]: I1126 15:09:59.786177 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f54c7c77d-rx8gm" podUID="5c09de21-84b0-440d-b34c-3054ec6741fc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Nov 26 15:10:00 crc kubenswrapper[4651]: I1126 15:10:00.302567 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Nov 26 15:10:00 crc kubenswrapper[4651]: I1126 15:10:00.988572 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.106346 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2lgk\" (UniqueName: \"kubernetes.io/projected/9f06ade1-9dc2-4175-a606-d83dc39d2c24-kube-api-access-s2lgk\") pod \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.106442 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-config-data\") pod \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.106463 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-combined-ca-bundle\") pod \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\" (UID: \"9f06ade1-9dc2-4175-a606-d83dc39d2c24\") " Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.123936 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f06ade1-9dc2-4175-a606-d83dc39d2c24-kube-api-access-s2lgk" (OuterVolumeSpecName: "kube-api-access-s2lgk") pod "9f06ade1-9dc2-4175-a606-d83dc39d2c24" (UID: "9f06ade1-9dc2-4175-a606-d83dc39d2c24"). InnerVolumeSpecName "kube-api-access-s2lgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.134798 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-config-data" (OuterVolumeSpecName: "config-data") pod "9f06ade1-9dc2-4175-a606-d83dc39d2c24" (UID: "9f06ade1-9dc2-4175-a606-d83dc39d2c24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.147489 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f06ade1-9dc2-4175-a606-d83dc39d2c24" (UID: "9f06ade1-9dc2-4175-a606-d83dc39d2c24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.208509 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2lgk\" (UniqueName: \"kubernetes.io/projected/9f06ade1-9dc2-4175-a606-d83dc39d2c24-kube-api-access-s2lgk\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.208550 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.208585 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f06ade1-9dc2-4175-a606-d83dc39d2c24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.305058 4651 generic.go:334] "Generic (PLEG): container finished" podID="9f06ade1-9dc2-4175-a606-d83dc39d2c24" containerID="bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363" 
exitCode=137 Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.305551 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.305243 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9f06ade1-9dc2-4175-a606-d83dc39d2c24","Type":"ContainerDied","Data":"bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363"} Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.306488 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9f06ade1-9dc2-4175-a606-d83dc39d2c24","Type":"ContainerDied","Data":"608570de0132a08a0806d960b0dd1a1f8511901be6f6038535322589ea75f3de"} Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.306515 4651 scope.go:117] "RemoveContainer" containerID="bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.343028 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.355379 4651 scope.go:117] "RemoveContainer" containerID="bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363" Nov 26 15:10:01 crc kubenswrapper[4651]: E1126 15:10:01.355972 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363\": container with ID starting with bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363 not found: ID does not exist" containerID="bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.355999 4651 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363"} err="failed to get container status \"bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363\": rpc error: code = NotFound desc = could not find container \"bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363\": container with ID starting with bd82226acec0c864ae800d415cf9693579c60ea132b739bb80eadcf86b72f363 not found: ID does not exist" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.369989 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.381436 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:10:01 crc kubenswrapper[4651]: E1126 15:10:01.381854 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f06ade1-9dc2-4175-a606-d83dc39d2c24" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.381877 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f06ade1-9dc2-4175-a606-d83dc39d2c24" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.382176 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f06ade1-9dc2-4175-a606-d83dc39d2c24" containerName="nova-cell1-novncproxy-novncproxy" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.382790 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.387130 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.394692 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.405381 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.412631 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.412693 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j8pc\" (UniqueName: \"kubernetes.io/projected/d5bc7701-632a-44c9-b812-0314af81833a-kube-api-access-7j8pc\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.412747 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.412843 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.412901 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.422536 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f06ade1-9dc2-4175-a606-d83dc39d2c24" path="/var/lib/kubelet/pods/9f06ade1-9dc2-4175-a606-d83dc39d2c24/volumes" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.423179 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.514415 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.514490 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j8pc\" (UniqueName: \"kubernetes.io/projected/d5bc7701-632a-44c9-b812-0314af81833a-kube-api-access-7j8pc\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.514535 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.514621 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.514671 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.519321 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.522830 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.523292 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-nova-novncproxy-tls-certs\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.524694 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bc7701-632a-44c9-b812-0314af81833a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.531181 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j8pc\" (UniqueName: \"kubernetes.io/projected/d5bc7701-632a-44c9-b812-0314af81833a-kube-api-access-7j8pc\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5bc7701-632a-44c9-b812-0314af81833a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:01 crc kubenswrapper[4651]: I1126 15:10:01.709728 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:02 crc kubenswrapper[4651]: I1126 15:10:02.361704 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 26 15:10:03 crc kubenswrapper[4651]: I1126 15:10:03.327691 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5bc7701-632a-44c9-b812-0314af81833a","Type":"ContainerStarted","Data":"e3a82441c057a53dfd79ed09a88e0b03c8060a7be720413ccaa3492db1928611"} Nov 26 15:10:03 crc kubenswrapper[4651]: I1126 15:10:03.328221 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5bc7701-632a-44c9-b812-0314af81833a","Type":"ContainerStarted","Data":"a88c78749a89cb02f311cc0c9a3161022ca6e562178cd2ce5c999b25aeb29d7e"} Nov 26 15:10:03 crc kubenswrapper[4651]: I1126 15:10:03.351089 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.35105682 podStartE2EDuration="2.35105682s" podCreationTimestamp="2025-11-26 15:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:10:03.340025876 +0000 UTC m=+1170.765773490" watchObservedRunningTime="2025-11-26 15:10:03.35105682 +0000 UTC m=+1170.776804424" Nov 26 15:10:03 crc kubenswrapper[4651]: I1126 15:10:03.708950 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 15:10:03 crc kubenswrapper[4651]: I1126 15:10:03.709443 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 26 15:10:03 crc kubenswrapper[4651]: I1126 15:10:03.712177 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 15:10:03 crc kubenswrapper[4651]: I1126 15:10:03.713201 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.341353 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.345669 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.575818 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95bd95597-dqpwc"] Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.578318 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.616857 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95bd95597-dqpwc"] Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.683170 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-dns-svc\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.683262 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-nb\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.683287 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfcbb\" (UniqueName: \"kubernetes.io/projected/8d312157-9607-429b-bcb0-c6bc126938a8-kube-api-access-xfcbb\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.683339 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-sb\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.683378 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-config\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.784951 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-config\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.785096 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-dns-svc\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.785181 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-nb\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.785209 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfcbb\" (UniqueName: \"kubernetes.io/projected/8d312157-9607-429b-bcb0-c6bc126938a8-kube-api-access-xfcbb\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.785274 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-sb\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.786093 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-config\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.786981 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-dns-svc\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.787079 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-sb\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.787123 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-nb\") pod \"dnsmasq-dns-95bd95597-dqpwc\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.811398 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfcbb\" (UniqueName: \"kubernetes.io/projected/8d312157-9607-429b-bcb0-c6bc126938a8-kube-api-access-xfcbb\") pod \"dnsmasq-dns-95bd95597-dqpwc\" 
(UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:04 crc kubenswrapper[4651]: I1126 15:10:04.914451 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:05 crc kubenswrapper[4651]: I1126 15:10:05.395408 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95bd95597-dqpwc"] Nov 26 15:10:05 crc kubenswrapper[4651]: W1126 15:10:05.402165 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d312157_9607_429b_bcb0_c6bc126938a8.slice/crio-4a019315beae501f321c4354412f3bc85a559207f76f815bd93fc8029c0be0f5 WatchSource:0}: Error finding container 4a019315beae501f321c4354412f3bc85a559207f76f815bd93fc8029c0be0f5: Status 404 returned error can't find the container with id 4a019315beae501f321c4354412f3bc85a559207f76f815bd93fc8029c0be0f5 Nov 26 15:10:06 crc kubenswrapper[4651]: I1126 15:10:06.365118 4651 generic.go:334] "Generic (PLEG): container finished" podID="8d312157-9607-429b-bcb0-c6bc126938a8" containerID="61dfba3a45df4f5ed7c88d6c6185d7884ccf14eb957fcb14c61e493e1302fad2" exitCode=0 Nov 26 15:10:06 crc kubenswrapper[4651]: I1126 15:10:06.365177 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" event={"ID":"8d312157-9607-429b-bcb0-c6bc126938a8","Type":"ContainerDied","Data":"61dfba3a45df4f5ed7c88d6c6185d7884ccf14eb957fcb14c61e493e1302fad2"} Nov 26 15:10:06 crc kubenswrapper[4651]: I1126 15:10:06.365470 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" event={"ID":"8d312157-9607-429b-bcb0-c6bc126938a8","Type":"ContainerStarted","Data":"4a019315beae501f321c4354412f3bc85a559207f76f815bd93fc8029c0be0f5"} Nov 26 15:10:06 crc kubenswrapper[4651]: I1126 15:10:06.710641 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.197744 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.198422 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="ceilometer-central-agent" containerID="cri-o://189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987" gracePeriod=30 Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.198467 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="sg-core" containerID="cri-o://65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264" gracePeriod=30 Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.198479 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="proxy-httpd" containerID="cri-o://28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc" gracePeriod=30 Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.198502 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="ceilometer-notification-agent" containerID="cri-o://77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13" gracePeriod=30 Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.210416 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.190:3000/\": EOF" Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.374111 4651 generic.go:334] "Generic (PLEG): container 
finished" podID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerID="65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264" exitCode=2 Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.374166 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5d3aa80-0713-402e-96de-d15177f6e7b2","Type":"ContainerDied","Data":"65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264"} Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.376107 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" event={"ID":"8d312157-9607-429b-bcb0-c6bc126938a8","Type":"ContainerStarted","Data":"436eabcc04ab03bec5e3a58f278a272b9c4d282d2fdaf634f3a33425311680dd"} Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.376374 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.483612 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" podStartSLOduration=3.483594042 podStartE2EDuration="3.483594042s" podCreationTimestamp="2025-11-26 15:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:10:07.404895451 +0000 UTC m=+1174.830643075" watchObservedRunningTime="2025-11-26 15:10:07.483594042 +0000 UTC m=+1174.909341646" Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.489810 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.490019 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" containerName="nova-api-log" containerID="cri-o://0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1" gracePeriod=30 Nov 26 
15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.490174 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" containerName="nova-api-api" containerID="cri-o://748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c" gracePeriod=30 Nov 26 15:10:07 crc kubenswrapper[4651]: I1126 15:10:07.712625 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.190:3000/\": dial tcp 10.217.0.190:3000: connect: connection refused" Nov 26 15:10:08 crc kubenswrapper[4651]: I1126 15:10:08.387134 4651 generic.go:334] "Generic (PLEG): container finished" podID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerID="28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc" exitCode=0 Nov 26 15:10:08 crc kubenswrapper[4651]: I1126 15:10:08.387176 4651 generic.go:334] "Generic (PLEG): container finished" podID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerID="189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987" exitCode=0 Nov 26 15:10:08 crc kubenswrapper[4651]: I1126 15:10:08.387225 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5d3aa80-0713-402e-96de-d15177f6e7b2","Type":"ContainerDied","Data":"28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc"} Nov 26 15:10:08 crc kubenswrapper[4651]: I1126 15:10:08.387258 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5d3aa80-0713-402e-96de-d15177f6e7b2","Type":"ContainerDied","Data":"189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987"} Nov 26 15:10:08 crc kubenswrapper[4651]: I1126 15:10:08.390173 4651 generic.go:334] "Generic (PLEG): container finished" podID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" 
containerID="0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1" exitCode=143 Nov 26 15:10:08 crc kubenswrapper[4651]: I1126 15:10:08.391126 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e","Type":"ContainerDied","Data":"0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1"} Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.137630 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.269397 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-scripts\") pod \"c5d3aa80-0713-402e-96de-d15177f6e7b2\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.269506 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fj7g\" (UniqueName: \"kubernetes.io/projected/c5d3aa80-0713-402e-96de-d15177f6e7b2-kube-api-access-5fj7g\") pod \"c5d3aa80-0713-402e-96de-d15177f6e7b2\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.269584 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-config-data\") pod \"c5d3aa80-0713-402e-96de-d15177f6e7b2\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.269613 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-sg-core-conf-yaml\") pod \"c5d3aa80-0713-402e-96de-d15177f6e7b2\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " Nov 26 15:10:09 crc 
kubenswrapper[4651]: I1126 15:10:09.269647 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-combined-ca-bundle\") pod \"c5d3aa80-0713-402e-96de-d15177f6e7b2\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.269678 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-run-httpd\") pod \"c5d3aa80-0713-402e-96de-d15177f6e7b2\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.269694 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-log-httpd\") pod \"c5d3aa80-0713-402e-96de-d15177f6e7b2\" (UID: \"c5d3aa80-0713-402e-96de-d15177f6e7b2\") " Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.270788 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c5d3aa80-0713-402e-96de-d15177f6e7b2" (UID: "c5d3aa80-0713-402e-96de-d15177f6e7b2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.271974 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c5d3aa80-0713-402e-96de-d15177f6e7b2" (UID: "c5d3aa80-0713-402e-96de-d15177f6e7b2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.276866 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d3aa80-0713-402e-96de-d15177f6e7b2-kube-api-access-5fj7g" (OuterVolumeSpecName: "kube-api-access-5fj7g") pod "c5d3aa80-0713-402e-96de-d15177f6e7b2" (UID: "c5d3aa80-0713-402e-96de-d15177f6e7b2"). InnerVolumeSpecName "kube-api-access-5fj7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.287405 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-scripts" (OuterVolumeSpecName: "scripts") pod "c5d3aa80-0713-402e-96de-d15177f6e7b2" (UID: "c5d3aa80-0713-402e-96de-d15177f6e7b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.305992 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c5d3aa80-0713-402e-96de-d15177f6e7b2" (UID: "c5d3aa80-0713-402e-96de-d15177f6e7b2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.371291 4651 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.371327 4651 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.371340 4651 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5d3aa80-0713-402e-96de-d15177f6e7b2-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.371354 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.371365 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fj7g\" (UniqueName: \"kubernetes.io/projected/c5d3aa80-0713-402e-96de-d15177f6e7b2-kube-api-access-5fj7g\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.400528 4651 generic.go:334] "Generic (PLEG): container finished" podID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerID="77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13" exitCode=0 Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.400581 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5d3aa80-0713-402e-96de-d15177f6e7b2","Type":"ContainerDied","Data":"77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13"} Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.400616 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5d3aa80-0713-402e-96de-d15177f6e7b2","Type":"ContainerDied","Data":"fc8f8757318b49ab27d2d4950fb2d3f84a5a916a63347e795aa92325c37060ea"} Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.400635 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.400637 4651 scope.go:117] "RemoveContainer" containerID="28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.406211 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-config-data" (OuterVolumeSpecName: "config-data") pod "c5d3aa80-0713-402e-96de-d15177f6e7b2" (UID: "c5d3aa80-0713-402e-96de-d15177f6e7b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.413246 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5d3aa80-0713-402e-96de-d15177f6e7b2" (UID: "c5d3aa80-0713-402e-96de-d15177f6e7b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.427270 4651 scope.go:117] "RemoveContainer" containerID="65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.458143 4651 scope.go:117] "RemoveContainer" containerID="77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.473070 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.473135 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d3aa80-0713-402e-96de-d15177f6e7b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.481350 4651 scope.go:117] "RemoveContainer" containerID="189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.500593 4651 scope.go:117] "RemoveContainer" containerID="28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc" Nov 26 15:10:09 crc kubenswrapper[4651]: E1126 15:10:09.501732 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc\": container with ID starting with 28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc not found: ID does not exist" containerID="28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.501915 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc"} 
err="failed to get container status \"28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc\": rpc error: code = NotFound desc = could not find container \"28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc\": container with ID starting with 28f7de96676b987696b229547cf03872fa7d2baf1463ff20923eb62a87d02edc not found: ID does not exist" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.501956 4651 scope.go:117] "RemoveContainer" containerID="65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264" Nov 26 15:10:09 crc kubenswrapper[4651]: E1126 15:10:09.502673 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264\": container with ID starting with 65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264 not found: ID does not exist" containerID="65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.502712 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264"} err="failed to get container status \"65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264\": rpc error: code = NotFound desc = could not find container \"65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264\": container with ID starting with 65db45e582f740d7ff58178b0ad1a0f03606dd7c67a76e7efc23e38821f01264 not found: ID does not exist" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.502733 4651 scope.go:117] "RemoveContainer" containerID="77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13" Nov 26 15:10:09 crc kubenswrapper[4651]: E1126 15:10:09.503205 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13\": container with ID starting with 77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13 not found: ID does not exist" containerID="77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.503239 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13"} err="failed to get container status \"77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13\": rpc error: code = NotFound desc = could not find container \"77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13\": container with ID starting with 77a51a594b685f4c7013cbea544dee76e24f85a6a2fe2dd5f7fbfe6358b11b13 not found: ID does not exist" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.503257 4651 scope.go:117] "RemoveContainer" containerID="189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987" Nov 26 15:10:09 crc kubenswrapper[4651]: E1126 15:10:09.503467 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987\": container with ID starting with 189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987 not found: ID does not exist" containerID="189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.503496 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987"} err="failed to get container status \"189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987\": rpc error: code = NotFound desc = could not find container \"189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987\": container with ID 
starting with 189f91b6881f3af7540e70a9f7b1bc60c9fb5b1291f83f178ccc0ab6dfccd987 not found: ID does not exist" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.800996 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.809934 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.825244 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:10:09 crc kubenswrapper[4651]: E1126 15:10:09.825597 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="sg-core" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.825614 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="sg-core" Nov 26 15:10:09 crc kubenswrapper[4651]: E1126 15:10:09.825633 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="ceilometer-central-agent" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.825640 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="ceilometer-central-agent" Nov 26 15:10:09 crc kubenswrapper[4651]: E1126 15:10:09.825653 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="ceilometer-notification-agent" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.825659 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="ceilometer-notification-agent" Nov 26 15:10:09 crc kubenswrapper[4651]: E1126 15:10:09.825674 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="proxy-httpd" Nov 26 15:10:09 crc 
kubenswrapper[4651]: I1126 15:10:09.825679 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="proxy-httpd" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.825858 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="proxy-httpd" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.825875 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="ceilometer-notification-agent" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.825887 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="sg-core" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.825896 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" containerName="ceilometer-central-agent" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.827912 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.835945 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.836460 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.884873 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.986440 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.986600 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.986636 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-scripts\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.986661 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-log-httpd\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " 
pod="openstack/ceilometer-0" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.986688 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-config-data\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.986707 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5t26\" (UniqueName: \"kubernetes.io/projected/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-kube-api-access-x5t26\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:09 crc kubenswrapper[4651]: I1126 15:10:09.986729 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-run-httpd\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.088761 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.088891 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.088916 4651 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-scripts\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.088932 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-log-httpd\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.088951 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-config-data\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.088964 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5t26\" (UniqueName: \"kubernetes.io/projected/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-kube-api-access-x5t26\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.088982 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-run-httpd\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.089579 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-run-httpd\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc 
kubenswrapper[4651]: I1126 15:10:10.089816 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-log-httpd\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.093588 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.094904 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-config-data\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.095448 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-scripts\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.102199 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.117528 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5t26\" (UniqueName: \"kubernetes.io/projected/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-kube-api-access-x5t26\") pod \"ceilometer-0\" (UID: 
\"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.197014 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:10:10 crc kubenswrapper[4651]: I1126 15:10:10.657105 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:10:10 crc kubenswrapper[4651]: W1126 15:10:10.661353 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda05b2a3_cd0a_43ee_b2c9_617ba633e84a.slice/crio-0b40feb516b9e2d1ed6d0ed8bedcb649593916f87058f4eab88e58e70211dba8 WatchSource:0}: Error finding container 0b40feb516b9e2d1ed6d0ed8bedcb649593916f87058f4eab88e58e70211dba8: Status 404 returned error can't find the container with id 0b40feb516b9e2d1ed6d0ed8bedcb649593916f87058f4eab88e58e70211dba8 Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.040332 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.207113 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdhzf\" (UniqueName: \"kubernetes.io/projected/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-kube-api-access-mdhzf\") pod \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.207178 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-logs\") pod \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.207397 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-combined-ca-bundle\") pod \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.207449 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-config-data\") pod \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\" (UID: \"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e\") " Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.209637 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-logs" (OuterVolumeSpecName: "logs") pod "b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" (UID: "b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.219263 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-kube-api-access-mdhzf" (OuterVolumeSpecName: "kube-api-access-mdhzf") pod "b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" (UID: "b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e"). InnerVolumeSpecName "kube-api-access-mdhzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.245277 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-config-data" (OuterVolumeSpecName: "config-data") pod "b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" (UID: "b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.263164 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" (UID: "b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.310101 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.310146 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.310157 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdhzf\" (UniqueName: \"kubernetes.io/projected/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-kube-api-access-mdhzf\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.310167 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.434909 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d3aa80-0713-402e-96de-d15177f6e7b2" path="/var/lib/kubelet/pods/c5d3aa80-0713-402e-96de-d15177f6e7b2/volumes" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.440437 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da05b2a3-cd0a-43ee-b2c9-617ba633e84a","Type":"ContainerStarted","Data":"0b40feb516b9e2d1ed6d0ed8bedcb649593916f87058f4eab88e58e70211dba8"} Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.447744 4651 generic.go:334] "Generic (PLEG): container finished" podID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" containerID="748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c" exitCode=0 Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.447791 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e","Type":"ContainerDied","Data":"748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c"} Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.447819 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e","Type":"ContainerDied","Data":"522575712fa63433baad942fcf26ed8b807e20a419eb919a8f789b305df5e220"} Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.447836 4651 scope.go:117] "RemoveContainer" containerID="748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.447963 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.473111 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.499383 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.512719 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:11 crc kubenswrapper[4651]: E1126 15:10:11.513167 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" containerName="nova-api-log" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.513187 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" containerName="nova-api-log" Nov 26 15:10:11 crc kubenswrapper[4651]: E1126 15:10:11.513202 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" containerName="nova-api-api" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.513208 4651 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" containerName="nova-api-api" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.513405 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" containerName="nova-api-api" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.513427 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" containerName="nova-api-log" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.513455 4651 scope.go:117] "RemoveContainer" containerID="0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.514980 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.517015 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-config-data\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.517419 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.517586 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: 
I1126 15:10:11.517793 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-logs\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.517850 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjtv7\" (UniqueName: \"kubernetes.io/projected/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-kube-api-access-rjtv7\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.517887 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.529371 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.531628 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.531938 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.532212 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.551017 4651 scope.go:117] "RemoveContainer" containerID="748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c" Nov 26 15:10:11 crc kubenswrapper[4651]: E1126 15:10:11.551894 4651 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c\": container with ID starting with 748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c not found: ID does not exist" containerID="748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.551920 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c"} err="failed to get container status \"748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c\": rpc error: code = NotFound desc = could not find container \"748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c\": container with ID starting with 748a064c8068a5578731a1e4a2a97a2d73b3de5b70e8a881f3adc92a0436573c not found: ID does not exist" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.551942 4651 scope.go:117] "RemoveContainer" containerID="0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1" Nov 26 15:10:11 crc kubenswrapper[4651]: E1126 15:10:11.554287 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1\": container with ID starting with 0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1 not found: ID does not exist" containerID="0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.554306 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1"} err="failed to get container status \"0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1\": rpc error: code = NotFound desc = could not find container 
\"0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1\": container with ID starting with 0c9bee797ff19229de40cbb9dc4a1f39fec5d0e54650390ae1e13f3ab76cb1d1 not found: ID does not exist" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.629277 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.629944 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-logs\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.630157 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjtv7\" (UniqueName: \"kubernetes.io/projected/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-kube-api-access-rjtv7\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.630306 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.630470 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-config-data\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 
15:10:11.630587 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.630326 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-logs\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.637337 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.637387 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.637661 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.637975 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-config-data\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " 
pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.651943 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjtv7\" (UniqueName: \"kubernetes.io/projected/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-kube-api-access-rjtv7\") pod \"nova-api-0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " pod="openstack/nova-api-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.710919 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.740635 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:11 crc kubenswrapper[4651]: I1126 15:10:11.842570 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.454752 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:12 crc kubenswrapper[4651]: W1126 15:10:12.461796 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ab0c372_165a_44b2_a38d_aeccd8bb98c0.slice/crio-343e222c48e8688377778d834e17a16696cce0aea4d075de7c894f64b45767e2 WatchSource:0}: Error finding container 343e222c48e8688377778d834e17a16696cce0aea4d075de7c894f64b45767e2: Status 404 returned error can't find the container with id 343e222c48e8688377778d834e17a16696cce0aea4d075de7c894f64b45767e2 Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.463242 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da05b2a3-cd0a-43ee-b2c9-617ba633e84a","Type":"ContainerStarted","Data":"f9dee5ef0012e8c3349add1a16d62e442727d20f6ba1b3e413aae6c10419d32c"} Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.502956 4651 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.701647 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-48lfs"] Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.719404 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.725252 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.725474 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.735558 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-48lfs"] Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.768211 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tnpf\" (UniqueName: \"kubernetes.io/projected/41567ca0-5457-4763-a8f9-b28588b4b7b1-kube-api-access-4tnpf\") pod \"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.769158 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-scripts\") pod \"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.769810 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-config-data\") pod \"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.770060 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.780492 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.836207 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.873496 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-config-data\") pod \"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.873635 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.873688 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tnpf\" (UniqueName: 
\"kubernetes.io/projected/41567ca0-5457-4763-a8f9-b28588b4b7b1-kube-api-access-4tnpf\") pod \"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.873766 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-scripts\") pod \"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.895722 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.897928 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-config-data\") pod \"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.916550 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tnpf\" (UniqueName: \"kubernetes.io/projected/41567ca0-5457-4763-a8f9-b28588b4b7b1-kube-api-access-4tnpf\") pod \"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.920655 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-scripts\") pod 
\"nova-cell1-cell-mapping-48lfs\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:12 crc kubenswrapper[4651]: I1126 15:10:12.974697 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:13 crc kubenswrapper[4651]: I1126 15:10:13.413744 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e" path="/var/lib/kubelet/pods/b8ddd2a5-e9f1-498e-9e33-8ec19b788d7e/volumes" Nov 26 15:10:13 crc kubenswrapper[4651]: I1126 15:10:13.478285 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ab0c372-165a-44b2-a38d-aeccd8bb98c0","Type":"ContainerStarted","Data":"f8dbeef8fa23ab4658d04f8f6c9c796174bff59ae3cfd5d27c63bebd4e2c7618"} Nov 26 15:10:13 crc kubenswrapper[4651]: I1126 15:10:13.478901 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ab0c372-165a-44b2-a38d-aeccd8bb98c0","Type":"ContainerStarted","Data":"2bbef4e09bce1f8ff997aea5085e5d7bc3cfa869d3364c1a87fd5b2fcbdf4ee3"} Nov 26 15:10:13 crc kubenswrapper[4651]: I1126 15:10:13.479127 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ab0c372-165a-44b2-a38d-aeccd8bb98c0","Type":"ContainerStarted","Data":"343e222c48e8688377778d834e17a16696cce0aea4d075de7c894f64b45767e2"} Nov 26 15:10:13 crc kubenswrapper[4651]: I1126 15:10:13.485581 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da05b2a3-cd0a-43ee-b2c9-617ba633e84a","Type":"ContainerStarted","Data":"b50a4fee48de9858edccce0740141fc837f1f815e6ab7017dc95423b6e59ec5a"} Nov 26 15:10:13 crc kubenswrapper[4651]: I1126 15:10:13.485790 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"da05b2a3-cd0a-43ee-b2c9-617ba633e84a","Type":"ContainerStarted","Data":"cdcf231011b8bd4f241dd24d6be4dc174b10f5366214f0b2bcf941dc6e3c65a8"} Nov 26 15:10:13 crc kubenswrapper[4651]: I1126 15:10:13.506363 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.506340668 podStartE2EDuration="2.506340668s" podCreationTimestamp="2025-11-26 15:10:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:10:13.499619373 +0000 UTC m=+1180.925366997" watchObservedRunningTime="2025-11-26 15:10:13.506340668 +0000 UTC m=+1180.932088292" Nov 26 15:10:13 crc kubenswrapper[4651]: I1126 15:10:13.548763 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-48lfs"] Nov 26 15:10:14 crc kubenswrapper[4651]: I1126 15:10:14.502670 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-48lfs" event={"ID":"41567ca0-5457-4763-a8f9-b28588b4b7b1","Type":"ContainerStarted","Data":"6170cfc0482c5eb2e8d56478263f3fd5df89a467d74a4d2cbce9f90980715d2d"} Nov 26 15:10:14 crc kubenswrapper[4651]: I1126 15:10:14.503680 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-48lfs" event={"ID":"41567ca0-5457-4763-a8f9-b28588b4b7b1","Type":"ContainerStarted","Data":"ecda89343d1aed8a4a1cb543e9b3cbd160cb460a03a135cc2f3c2373052c2389"} Nov 26 15:10:14 crc kubenswrapper[4651]: I1126 15:10:14.531682 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-48lfs" podStartSLOduration=2.531663843 podStartE2EDuration="2.531663843s" podCreationTimestamp="2025-11-26 15:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:10:14.528905707 +0000 UTC m=+1181.954653331" 
watchObservedRunningTime="2025-11-26 15:10:14.531663843 +0000 UTC m=+1181.957411447" Nov 26 15:10:14 crc kubenswrapper[4651]: I1126 15:10:14.916278 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:10:14 crc kubenswrapper[4651]: I1126 15:10:14.975577 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.010203 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56d99cc479-5blqc"] Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.010677 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" podUID="1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" containerName="dnsmasq-dns" containerID="cri-o://4a4acab3bcd2984c44e8ad06fdad13952da7964ad17a9ef4b79965e76753e9d0" gracePeriod=10 Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.521978 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-f54c7c77d-rx8gm" Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.544158 4651 generic.go:334] "Generic (PLEG): container finished" podID="1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" containerID="4a4acab3bcd2984c44e8ad06fdad13952da7964ad17a9ef4b79965e76753e9d0" exitCode=0 Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.544241 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" event={"ID":"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4","Type":"ContainerDied","Data":"4a4acab3bcd2984c44e8ad06fdad13952da7964ad17a9ef4b79965e76753e9d0"} Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.594928 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6974b49b94-vzn8h"] Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.595190 4651 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon-log" containerID="cri-o://041b9e0af2f72c70708cca2245ba415e6c5829af6bb51c79a57997f41bb12658" gracePeriod=30 Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.595634 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" containerID="cri-o://e459d337cfdf21c6171a193e9e9d70d57ce29ab97edf0ea60127ef435043b603" gracePeriod=30 Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.684337 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.852359 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-nb\") pod \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.852851 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-sb\") pod \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.852946 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l57lb\" (UniqueName: \"kubernetes.io/projected/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-kube-api-access-l57lb\") pod \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.853079 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-config\") pod \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.853139 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-dns-svc\") pod \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\" (UID: \"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4\") " Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.857181 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-kube-api-access-l57lb" (OuterVolumeSpecName: "kube-api-access-l57lb") pod "1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" (UID: "1b5bfc2d-34ea-421b-802e-c0aa0294a5d4"). InnerVolumeSpecName "kube-api-access-l57lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.908996 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" (UID: "1b5bfc2d-34ea-421b-802e-c0aa0294a5d4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.918464 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-config" (OuterVolumeSpecName: "config") pod "1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" (UID: "1b5bfc2d-34ea-421b-802e-c0aa0294a5d4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.923713 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" (UID: "1b5bfc2d-34ea-421b-802e-c0aa0294a5d4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.929559 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" (UID: "1b5bfc2d-34ea-421b-802e-c0aa0294a5d4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.955809 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l57lb\" (UniqueName: \"kubernetes.io/projected/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-kube-api-access-l57lb\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.955844 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.955853 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:15 crc kubenswrapper[4651]: I1126 15:10:15.955861 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:15 crc 
kubenswrapper[4651]: I1126 15:10:15.955871 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:16 crc kubenswrapper[4651]: I1126 15:10:16.554052 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da05b2a3-cd0a-43ee-b2c9-617ba633e84a","Type":"ContainerStarted","Data":"7b68710ee6f1287d334bdee507f2d36cbce9d2b080b897729bcc85d7923fc31c"} Nov 26 15:10:16 crc kubenswrapper[4651]: I1126 15:10:16.554685 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 26 15:10:16 crc kubenswrapper[4651]: I1126 15:10:16.555492 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" event={"ID":"1b5bfc2d-34ea-421b-802e-c0aa0294a5d4","Type":"ContainerDied","Data":"453e940b80b90eb0ca818acad9f17422383bc68c42d96fbdf44ae974543a52ca"} Nov 26 15:10:16 crc kubenswrapper[4651]: I1126 15:10:16.555526 4651 scope.go:117] "RemoveContainer" containerID="4a4acab3bcd2984c44e8ad06fdad13952da7964ad17a9ef4b79965e76753e9d0" Nov 26 15:10:16 crc kubenswrapper[4651]: I1126 15:10:16.555577 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56d99cc479-5blqc" Nov 26 15:10:16 crc kubenswrapper[4651]: E1126 15:10:16.570309 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openstack/swift-proxy-6978d54687-jsqtl" podUID="09fca043-ad27-4285-8894-522bc6cc68f4" Nov 26 15:10:16 crc kubenswrapper[4651]: I1126 15:10:16.577998 4651 scope.go:117] "RemoveContainer" containerID="e876c628b4ef2f370ad838ef81ff84f8987828adb52bf9265dd04beaeb25d5cd" Nov 26 15:10:16 crc kubenswrapper[4651]: I1126 15:10:16.616092 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.105946237 podStartE2EDuration="7.616030434s" podCreationTimestamp="2025-11-26 15:10:09 +0000 UTC" firstStartedPulling="2025-11-26 15:10:10.663783952 +0000 UTC m=+1178.089531556" lastFinishedPulling="2025-11-26 15:10:15.173868149 +0000 UTC m=+1182.599615753" observedRunningTime="2025-11-26 15:10:16.614664857 +0000 UTC m=+1184.040412481" watchObservedRunningTime="2025-11-26 15:10:16.616030434 +0000 UTC m=+1184.041778028" Nov 26 15:10:16 crc kubenswrapper[4651]: I1126 15:10:16.642692 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56d99cc479-5blqc"] Nov 26 15:10:16 crc kubenswrapper[4651]: I1126 15:10:16.650812 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56d99cc479-5blqc"] Nov 26 15:10:17 crc kubenswrapper[4651]: I1126 15:10:17.412892 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" path="/var/lib/kubelet/pods/1b5bfc2d-34ea-421b-802e-c0aa0294a5d4/volumes" Nov 26 15:10:17 crc kubenswrapper[4651]: I1126 15:10:17.566327 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:10:19 crc kubenswrapper[4651]: I1126 15:10:19.586201 4651 generic.go:334] "Generic (PLEG): container finished" podID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerID="e459d337cfdf21c6171a193e9e9d70d57ce29ab97edf0ea60127ef435043b603" exitCode=0 Nov 26 15:10:19 crc kubenswrapper[4651]: I1126 15:10:19.586398 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6974b49b94-vzn8h" event={"ID":"97c5789f-f8f7-4780-8c73-e34bc5bb4f56","Type":"ContainerDied","Data":"e459d337cfdf21c6171a193e9e9d70d57ce29ab97edf0ea60127ef435043b603"} Nov 26 15:10:19 crc kubenswrapper[4651]: I1126 15:10:19.586840 4651 scope.go:117] "RemoveContainer" containerID="a9e18539050248184621d537c94cd7c6c67bed8a523b93401626fecf6ae227ef" Nov 26 15:10:19 crc kubenswrapper[4651]: I1126 15:10:19.612307 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 26 15:10:20 crc kubenswrapper[4651]: I1126 15:10:20.600261 4651 generic.go:334] "Generic (PLEG): container finished" podID="41567ca0-5457-4763-a8f9-b28588b4b7b1" containerID="6170cfc0482c5eb2e8d56478263f3fd5df89a467d74a4d2cbce9f90980715d2d" exitCode=0 Nov 26 15:10:20 crc kubenswrapper[4651]: I1126 15:10:20.600312 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-48lfs" event={"ID":"41567ca0-5457-4763-a8f9-b28588b4b7b1","Type":"ContainerDied","Data":"6170cfc0482c5eb2e8d56478263f3fd5df89a467d74a4d2cbce9f90980715d2d"} Nov 26 15:10:21 crc kubenswrapper[4651]: I1126 15:10:21.573386 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:10:21 crc kubenswrapper[4651]: E1126 15:10:21.573565 4651 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:10:21 crc kubenswrapper[4651]: E1126 15:10:21.573795 4651 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6978d54687-jsqtl: configmap "swift-ring-files" not found Nov 26 15:10:21 crc kubenswrapper[4651]: E1126 15:10:21.573853 4651 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift podName:09fca043-ad27-4285-8894-522bc6cc68f4 nodeName:}" failed. No retries permitted until 2025-11-26 15:12:23.573834812 +0000 UTC m=+1310.999582416 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift") pod "swift-proxy-6978d54687-jsqtl" (UID: "09fca043-ad27-4285-8894-522bc6cc68f4") : configmap "swift-ring-files" not found Nov 26 15:10:21 crc kubenswrapper[4651]: I1126 15:10:21.842813 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:10:21 crc kubenswrapper[4651]: I1126 15:10:21.842868 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.009489 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.186605 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-scripts\") pod \"41567ca0-5457-4763-a8f9-b28588b4b7b1\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.186713 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tnpf\" (UniqueName: \"kubernetes.io/projected/41567ca0-5457-4763-a8f9-b28588b4b7b1-kube-api-access-4tnpf\") pod \"41567ca0-5457-4763-a8f9-b28588b4b7b1\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.186746 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-combined-ca-bundle\") pod \"41567ca0-5457-4763-a8f9-b28588b4b7b1\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.186790 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-config-data\") pod \"41567ca0-5457-4763-a8f9-b28588b4b7b1\" (UID: \"41567ca0-5457-4763-a8f9-b28588b4b7b1\") " Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.198209 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-scripts" (OuterVolumeSpecName: "scripts") pod "41567ca0-5457-4763-a8f9-b28588b4b7b1" (UID: "41567ca0-5457-4763-a8f9-b28588b4b7b1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.209772 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41567ca0-5457-4763-a8f9-b28588b4b7b1-kube-api-access-4tnpf" (OuterVolumeSpecName: "kube-api-access-4tnpf") pod "41567ca0-5457-4763-a8f9-b28588b4b7b1" (UID: "41567ca0-5457-4763-a8f9-b28588b4b7b1"). InnerVolumeSpecName "kube-api-access-4tnpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.227498 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41567ca0-5457-4763-a8f9-b28588b4b7b1" (UID: "41567ca0-5457-4763-a8f9-b28588b4b7b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.233492 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-config-data" (OuterVolumeSpecName: "config-data") pod "41567ca0-5457-4763-a8f9-b28588b4b7b1" (UID: "41567ca0-5457-4763-a8f9-b28588b4b7b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.289401 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.289438 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tnpf\" (UniqueName: \"kubernetes.io/projected/41567ca0-5457-4763-a8f9-b28588b4b7b1-kube-api-access-4tnpf\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.289453 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.289463 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41567ca0-5457-4763-a8f9-b28588b4b7b1-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.623924 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-48lfs" event={"ID":"41567ca0-5457-4763-a8f9-b28588b4b7b1","Type":"ContainerDied","Data":"ecda89343d1aed8a4a1cb543e9b3cbd160cb460a03a135cc2f3c2373052c2389"} Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.623996 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-48lfs" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.624020 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecda89343d1aed8a4a1cb543e9b3cbd160cb460a03a135cc2f3c2373052c2389" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.803061 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.803524 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" containerName="nova-api-log" containerID="cri-o://2bbef4e09bce1f8ff997aea5085e5d7bc3cfa869d3364c1a87fd5b2fcbdf4ee3" gracePeriod=30 Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.803664 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" containerName="nova-api-api" containerID="cri-o://f8dbeef8fa23ab4658d04f8f6c9c796174bff59ae3cfd5d27c63bebd4e2c7618" gracePeriod=30 Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.811393 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": EOF" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.811388 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": EOF" Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.820176 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.820440 4651 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="4e94ac87-b21e-4f95-98ac-d97c604aaa30" containerName="nova-scheduler-scheduler" containerID="cri-o://b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491" gracePeriod=30 Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.892780 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.892984 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-log" containerID="cri-o://45ba5535542b59701406caadd2410eea4b79aae4fde3b5ba66e91d74fb60bc2b" gracePeriod=30 Nov 26 15:10:22 crc kubenswrapper[4651]: I1126 15:10:22.893382 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-metadata" containerID="cri-o://8c9fe5be740e9003884ecb7d4016fca9c33b14d93ae801df5652b5720280676e" gracePeriod=30 Nov 26 15:10:23 crc kubenswrapper[4651]: E1126 15:10:23.513890 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:10:23 crc kubenswrapper[4651]: E1126 15:10:23.516947 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:10:23 crc kubenswrapper[4651]: E1126 15:10:23.521196 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 26 15:10:23 crc kubenswrapper[4651]: E1126 15:10:23.521327 4651 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4e94ac87-b21e-4f95-98ac-d97c604aaa30" containerName="nova-scheduler-scheduler" Nov 26 15:10:23 crc kubenswrapper[4651]: I1126 15:10:23.635367 4651 generic.go:334] "Generic (PLEG): container finished" podID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerID="45ba5535542b59701406caadd2410eea4b79aae4fde3b5ba66e91d74fb60bc2b" exitCode=143 Nov 26 15:10:23 crc kubenswrapper[4651]: I1126 15:10:23.635444 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa55b0ac-a745-462a-a3a9-2bf9266f60a8","Type":"ContainerDied","Data":"45ba5535542b59701406caadd2410eea4b79aae4fde3b5ba66e91d74fb60bc2b"} Nov 26 15:10:23 crc kubenswrapper[4651]: I1126 15:10:23.637167 4651 generic.go:334] "Generic (PLEG): container finished" podID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" containerID="2bbef4e09bce1f8ff997aea5085e5d7bc3cfa869d3364c1a87fd5b2fcbdf4ee3" exitCode=143 Nov 26 15:10:23 crc kubenswrapper[4651]: I1126 15:10:23.637198 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ab0c372-165a-44b2-a38d-aeccd8bb98c0","Type":"ContainerDied","Data":"2bbef4e09bce1f8ff997aea5085e5d7bc3cfa869d3364c1a87fd5b2fcbdf4ee3"} Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.205304 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:38564->10.217.0.192:8775: read: connection reset by peer" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.205384 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:38576->10.217.0.192:8775: read: connection reset by peer" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.663646 4651 generic.go:334] "Generic (PLEG): container finished" podID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerID="8c9fe5be740e9003884ecb7d4016fca9c33b14d93ae801df5652b5720280676e" exitCode=0 Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.663768 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa55b0ac-a745-462a-a3a9-2bf9266f60a8","Type":"ContainerDied","Data":"8c9fe5be740e9003884ecb7d4016fca9c33b14d93ae801df5652b5720280676e"} Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.663930 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa55b0ac-a745-462a-a3a9-2bf9266f60a8","Type":"ContainerDied","Data":"7a5f723fe2aefe63370b51ebef592599cbe7889e927abc9f3de1257781a8b439"} Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.663945 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a5f723fe2aefe63370b51ebef592599cbe7889e927abc9f3de1257781a8b439" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.688418 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.773827 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmqct\" (UniqueName: \"kubernetes.io/projected/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-kube-api-access-tmqct\") pod \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.773887 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-config-data\") pod \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.773909 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-nova-metadata-tls-certs\") pod \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.773939 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-logs\") pod \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.773998 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-combined-ca-bundle\") pod \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\" (UID: \"fa55b0ac-a745-462a-a3a9-2bf9266f60a8\") " Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.774488 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-logs" (OuterVolumeSpecName: "logs") pod "fa55b0ac-a745-462a-a3a9-2bf9266f60a8" (UID: "fa55b0ac-a745-462a-a3a9-2bf9266f60a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.775235 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.798078 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-kube-api-access-tmqct" (OuterVolumeSpecName: "kube-api-access-tmqct") pod "fa55b0ac-a745-462a-a3a9-2bf9266f60a8" (UID: "fa55b0ac-a745-462a-a3a9-2bf9266f60a8"). InnerVolumeSpecName "kube-api-access-tmqct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.821014 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-config-data" (OuterVolumeSpecName: "config-data") pod "fa55b0ac-a745-462a-a3a9-2bf9266f60a8" (UID: "fa55b0ac-a745-462a-a3a9-2bf9266f60a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.824347 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa55b0ac-a745-462a-a3a9-2bf9266f60a8" (UID: "fa55b0ac-a745-462a-a3a9-2bf9266f60a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.877801 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmqct\" (UniqueName: \"kubernetes.io/projected/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-kube-api-access-tmqct\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.877835 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.877871 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.881626 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fa55b0ac-a745-462a-a3a9-2bf9266f60a8" (UID: "fa55b0ac-a745-462a-a3a9-2bf9266f60a8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:26 crc kubenswrapper[4651]: I1126 15:10:26.979928 4651 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa55b0ac-a745-462a-a3a9-2bf9266f60a8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.670130 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.710677 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.719950 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.743160 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:10:27 crc kubenswrapper[4651]: E1126 15:10:27.743627 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" containerName="init" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.743650 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" containerName="init" Nov 26 15:10:27 crc kubenswrapper[4651]: E1126 15:10:27.743672 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-log" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.743680 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-log" Nov 26 15:10:27 crc kubenswrapper[4651]: E1126 15:10:27.743688 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41567ca0-5457-4763-a8f9-b28588b4b7b1" containerName="nova-manage" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.743696 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="41567ca0-5457-4763-a8f9-b28588b4b7b1" containerName="nova-manage" Nov 26 15:10:27 crc kubenswrapper[4651]: E1126 15:10:27.743709 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" containerName="dnsmasq-dns" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.743717 4651 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" containerName="dnsmasq-dns" Nov 26 15:10:27 crc kubenswrapper[4651]: E1126 15:10:27.743727 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-metadata" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.743734 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-metadata" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.743967 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-metadata" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.743989 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5bfc2d-34ea-421b-802e-c0aa0294a5d4" containerName="dnsmasq-dns" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.744002 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="41567ca0-5457-4763-a8f9-b28588b4b7b1" containerName="nova-manage" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.744021 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" containerName="nova-metadata-log" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.745241 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.749833 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.750853 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.761205 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.900297 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71cdc146-583a-403a-9758-de42eea152de-config-data\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.900347 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71cdc146-583a-403a-9758-de42eea152de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.900384 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cdc146-583a-403a-9758-de42eea152de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.900891 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cdc146-583a-403a-9758-de42eea152de-logs\") pod \"nova-metadata-0\" (UID: 
\"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:27 crc kubenswrapper[4651]: I1126 15:10:27.900984 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb2kp\" (UniqueName: \"kubernetes.io/projected/71cdc146-583a-403a-9758-de42eea152de-kube-api-access-nb2kp\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.003715 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cdc146-583a-403a-9758-de42eea152de-logs\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.004229 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cdc146-583a-403a-9758-de42eea152de-logs\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.003860 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb2kp\" (UniqueName: \"kubernetes.io/projected/71cdc146-583a-403a-9758-de42eea152de-kube-api-access-nb2kp\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.004602 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71cdc146-583a-403a-9758-de42eea152de-config-data\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.005724 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71cdc146-583a-403a-9758-de42eea152de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.005776 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cdc146-583a-403a-9758-de42eea152de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.010962 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71cdc146-583a-403a-9758-de42eea152de-config-data\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.011610 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cdc146-583a-403a-9758-de42eea152de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.019142 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71cdc146-583a-403a-9758-de42eea152de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.019670 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb2kp\" (UniqueName: \"kubernetes.io/projected/71cdc146-583a-403a-9758-de42eea152de-kube-api-access-nb2kp\") pod 
\"nova-metadata-0\" (UID: \"71cdc146-583a-403a-9758-de42eea152de\") " pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.062337 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.246439 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.413931 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxg9f\" (UniqueName: \"kubernetes.io/projected/4e94ac87-b21e-4f95-98ac-d97c604aaa30-kube-api-access-bxg9f\") pod \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.414097 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-config-data\") pod \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.414227 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-combined-ca-bundle\") pod \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\" (UID: \"4e94ac87-b21e-4f95-98ac-d97c604aaa30\") " Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.429008 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e94ac87-b21e-4f95-98ac-d97c604aaa30-kube-api-access-bxg9f" (OuterVolumeSpecName: "kube-api-access-bxg9f") pod "4e94ac87-b21e-4f95-98ac-d97c604aaa30" (UID: "4e94ac87-b21e-4f95-98ac-d97c604aaa30"). InnerVolumeSpecName "kube-api-access-bxg9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.445521 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-config-data" (OuterVolumeSpecName: "config-data") pod "4e94ac87-b21e-4f95-98ac-d97c604aaa30" (UID: "4e94ac87-b21e-4f95-98ac-d97c604aaa30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.447811 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e94ac87-b21e-4f95-98ac-d97c604aaa30" (UID: "4e94ac87-b21e-4f95-98ac-d97c604aaa30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.519471 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.519499 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e94ac87-b21e-4f95-98ac-d97c604aaa30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.519510 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxg9f\" (UniqueName: \"kubernetes.io/projected/4e94ac87-b21e-4f95-98ac-d97c604aaa30-kube-api-access-bxg9f\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.575510 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 26 15:10:28 crc kubenswrapper[4651]: W1126 15:10:28.591130 4651 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71cdc146_583a_403a_9758_de42eea152de.slice/crio-c949f4af23803317fc539f024507f1389b3e1cfff1f6ed99b49286bdd899d359 WatchSource:0}: Error finding container c949f4af23803317fc539f024507f1389b3e1cfff1f6ed99b49286bdd899d359: Status 404 returned error can't find the container with id c949f4af23803317fc539f024507f1389b3e1cfff1f6ed99b49286bdd899d359 Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.690004 4651 generic.go:334] "Generic (PLEG): container finished" podID="4e94ac87-b21e-4f95-98ac-d97c604aaa30" containerID="b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491" exitCode=0 Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.690349 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e94ac87-b21e-4f95-98ac-d97c604aaa30","Type":"ContainerDied","Data":"b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491"} Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.690376 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e94ac87-b21e-4f95-98ac-d97c604aaa30","Type":"ContainerDied","Data":"e9ee5635a45644b4012fedbeb38fc9dcef6e90326947fe10ac1f5fc74d170d58"} Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.690392 4651 scope.go:117] "RemoveContainer" containerID="b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.690509 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.698602 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71cdc146-583a-403a-9758-de42eea152de","Type":"ContainerStarted","Data":"c949f4af23803317fc539f024507f1389b3e1cfff1f6ed99b49286bdd899d359"} Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.700885 4651 generic.go:334] "Generic (PLEG): container finished" podID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" containerID="f8dbeef8fa23ab4658d04f8f6c9c796174bff59ae3cfd5d27c63bebd4e2c7618" exitCode=0 Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.700928 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ab0c372-165a-44b2-a38d-aeccd8bb98c0","Type":"ContainerDied","Data":"f8dbeef8fa23ab4658d04f8f6c9c796174bff59ae3cfd5d27c63bebd4e2c7618"} Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.737194 4651 scope.go:117] "RemoveContainer" containerID="b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491" Nov 26 15:10:28 crc kubenswrapper[4651]: E1126 15:10:28.739404 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491\": container with ID starting with b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491 not found: ID does not exist" containerID="b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.740171 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491"} err="failed to get container status \"b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491\": rpc error: code = NotFound desc = could not find container 
\"b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491\": container with ID starting with b14c2d5a32be7b3fca3907ca06dddc71965bd33ecf841c003aac8ab12981b491 not found: ID does not exist" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.782675 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.799012 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.809706 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.820888 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:10:28 crc kubenswrapper[4651]: E1126 15:10:28.821483 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" containerName="nova-api-api" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.821555 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" containerName="nova-api-api" Nov 26 15:10:28 crc kubenswrapper[4651]: E1126 15:10:28.821617 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e94ac87-b21e-4f95-98ac-d97c604aaa30" containerName="nova-scheduler-scheduler" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.821683 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e94ac87-b21e-4f95-98ac-d97c604aaa30" containerName="nova-scheduler-scheduler" Nov 26 15:10:28 crc kubenswrapper[4651]: E1126 15:10:28.821741 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" containerName="nova-api-log" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.821791 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" 
containerName="nova-api-log" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.822030 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e94ac87-b21e-4f95-98ac-d97c604aaa30" containerName="nova-scheduler-scheduler" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.822139 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" containerName="nova-api-log" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.822219 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" containerName="nova-api-api" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.822917 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.825299 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.893475 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.930699 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-public-tls-certs\") pod \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.930817 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-combined-ca-bundle\") pod \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.930863 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-config-data\") pod \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.930925 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-logs\") pod \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.930969 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-internal-tls-certs\") pod \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.931028 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjtv7\" (UniqueName: \"kubernetes.io/projected/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-kube-api-access-rjtv7\") pod \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\" (UID: \"5ab0c372-165a-44b2-a38d-aeccd8bb98c0\") " Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.931387 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lzvc\" (UniqueName: \"kubernetes.io/projected/7fd0076a-46e8-4776-845d-f69a0679c989-kube-api-access-6lzvc\") pod \"nova-scheduler-0\" (UID: \"7fd0076a-46e8-4776-845d-f69a0679c989\") " pod="openstack/nova-scheduler-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.931506 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fd0076a-46e8-4776-845d-f69a0679c989-config-data\") pod \"nova-scheduler-0\" (UID: \"7fd0076a-46e8-4776-845d-f69a0679c989\") " 
pod="openstack/nova-scheduler-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.931562 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd0076a-46e8-4776-845d-f69a0679c989-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7fd0076a-46e8-4776-845d-f69a0679c989\") " pod="openstack/nova-scheduler-0" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.933371 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-logs" (OuterVolumeSpecName: "logs") pod "5ab0c372-165a-44b2-a38d-aeccd8bb98c0" (UID: "5ab0c372-165a-44b2-a38d-aeccd8bb98c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.939195 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-kube-api-access-rjtv7" (OuterVolumeSpecName: "kube-api-access-rjtv7") pod "5ab0c372-165a-44b2-a38d-aeccd8bb98c0" (UID: "5ab0c372-165a-44b2-a38d-aeccd8bb98c0"). InnerVolumeSpecName "kube-api-access-rjtv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.961554 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-config-data" (OuterVolumeSpecName: "config-data") pod "5ab0c372-165a-44b2-a38d-aeccd8bb98c0" (UID: "5ab0c372-165a-44b2-a38d-aeccd8bb98c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.972341 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ab0c372-165a-44b2-a38d-aeccd8bb98c0" (UID: "5ab0c372-165a-44b2-a38d-aeccd8bb98c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.984895 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5ab0c372-165a-44b2-a38d-aeccd8bb98c0" (UID: "5ab0c372-165a-44b2-a38d-aeccd8bb98c0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:28 crc kubenswrapper[4651]: I1126 15:10:28.989827 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5ab0c372-165a-44b2-a38d-aeccd8bb98c0" (UID: "5ab0c372-165a-44b2-a38d-aeccd8bb98c0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.032849 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lzvc\" (UniqueName: \"kubernetes.io/projected/7fd0076a-46e8-4776-845d-f69a0679c989-kube-api-access-6lzvc\") pod \"nova-scheduler-0\" (UID: \"7fd0076a-46e8-4776-845d-f69a0679c989\") " pod="openstack/nova-scheduler-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.033078 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fd0076a-46e8-4776-845d-f69a0679c989-config-data\") pod \"nova-scheduler-0\" (UID: \"7fd0076a-46e8-4776-845d-f69a0679c989\") " pod="openstack/nova-scheduler-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.033197 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd0076a-46e8-4776-845d-f69a0679c989-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7fd0076a-46e8-4776-845d-f69a0679c989\") " pod="openstack/nova-scheduler-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.033311 4651 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.033373 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.033429 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:29 crc kubenswrapper[4651]: 
I1126 15:10:29.033485 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.033551 4651 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.033612 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjtv7\" (UniqueName: \"kubernetes.io/projected/5ab0c372-165a-44b2-a38d-aeccd8bb98c0-kube-api-access-rjtv7\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.038470 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fd0076a-46e8-4776-845d-f69a0679c989-config-data\") pod \"nova-scheduler-0\" (UID: \"7fd0076a-46e8-4776-845d-f69a0679c989\") " pod="openstack/nova-scheduler-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.041258 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd0076a-46e8-4776-845d-f69a0679c989-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7fd0076a-46e8-4776-845d-f69a0679c989\") " pod="openstack/nova-scheduler-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.053582 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lzvc\" (UniqueName: \"kubernetes.io/projected/7fd0076a-46e8-4776-845d-f69a0679c989-kube-api-access-6lzvc\") pod \"nova-scheduler-0\" (UID: \"7fd0076a-46e8-4776-845d-f69a0679c989\") " pod="openstack/nova-scheduler-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.142272 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.418237 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e94ac87-b21e-4f95-98ac-d97c604aaa30" path="/var/lib/kubelet/pods/4e94ac87-b21e-4f95-98ac-d97c604aaa30/volumes" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.422020 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa55b0ac-a745-462a-a3a9-2bf9266f60a8" path="/var/lib/kubelet/pods/fa55b0ac-a745-462a-a3a9-2bf9266f60a8/volumes" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.585186 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.613146 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.714159 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71cdc146-583a-403a-9758-de42eea152de","Type":"ContainerStarted","Data":"ee29d1f0e378b30097e30db04bb6e243f7b59520bafbf08a322e1c5cd84426e5"} Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.714210 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71cdc146-583a-403a-9758-de42eea152de","Type":"ContainerStarted","Data":"4895a5a9261a1e6db001451994a84219801fd4eacb7cc903dff68daf33f7c053"} Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.726180 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ab0c372-165a-44b2-a38d-aeccd8bb98c0","Type":"ContainerDied","Data":"343e222c48e8688377778d834e17a16696cce0aea4d075de7c894f64b45767e2"} 
Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.726231 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.726275 4651 scope.go:117] "RemoveContainer" containerID="f8dbeef8fa23ab4658d04f8f6c9c796174bff59ae3cfd5d27c63bebd4e2c7618" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.742347 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7fd0076a-46e8-4776-845d-f69a0679c989","Type":"ContainerStarted","Data":"a0c685a201b223ec1f0187a82919a00313e04c47cd381f6b7bfd3dd908c04c76"} Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.747320 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.747290011 podStartE2EDuration="2.747290011s" podCreationTimestamp="2025-11-26 15:10:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:10:29.730709253 +0000 UTC m=+1197.156456877" watchObservedRunningTime="2025-11-26 15:10:29.747290011 +0000 UTC m=+1197.173037625" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.773012 4651 scope.go:117] "RemoveContainer" containerID="2bbef4e09bce1f8ff997aea5085e5d7bc3cfa869d3364c1a87fd5b2fcbdf4ee3" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.777975 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.799102 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.816357 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.817977 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.822185 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.822525 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.822725 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.826509 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.952277 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-public-tls-certs\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.952344 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.952455 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq68s\" (UniqueName: \"kubernetes.io/projected/c4174859-235d-4a45-ba34-178b888f3513-kube-api-access-wq68s\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.952485 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4174859-235d-4a45-ba34-178b888f3513-logs\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.952531 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-config-data\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:29 crc kubenswrapper[4651]: I1126 15:10:29.952579 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.054007 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-public-tls-certs\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.054112 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.054192 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq68s\" (UniqueName: \"kubernetes.io/projected/c4174859-235d-4a45-ba34-178b888f3513-kube-api-access-wq68s\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " 
pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.054214 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4174859-235d-4a45-ba34-178b888f3513-logs\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.054243 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-config-data\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.054277 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.054913 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4174859-235d-4a45-ba34-178b888f3513-logs\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.057695 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-config-data\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.058803 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.060466 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.071216 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4174859-235d-4a45-ba34-178b888f3513-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.073356 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq68s\" (UniqueName: \"kubernetes.io/projected/c4174859-235d-4a45-ba34-178b888f3513-kube-api-access-wq68s\") pod \"nova-api-0\" (UID: \"c4174859-235d-4a45-ba34-178b888f3513\") " pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.183959 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.624450 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.754495 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7fd0076a-46e8-4776-845d-f69a0679c989","Type":"ContainerStarted","Data":"2b7c6356b2ace597c212459f64f41ab18c5a53861d81e6d777e19bb449d0ff50"} Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.773720 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4174859-235d-4a45-ba34-178b888f3513","Type":"ContainerStarted","Data":"4cf655b5362806d35c5f3869f086f9eaa54b49b200679dbd89d870c46121e952"} Nov 26 15:10:30 crc kubenswrapper[4651]: I1126 15:10:30.780598 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.780577675 podStartE2EDuration="2.780577675s" podCreationTimestamp="2025-11-26 15:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:10:30.778294452 +0000 UTC m=+1198.204042086" watchObservedRunningTime="2025-11-26 15:10:30.780577675 +0000 UTC m=+1198.206325279" Nov 26 15:10:31 crc kubenswrapper[4651]: I1126 15:10:31.417996 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ab0c372-165a-44b2-a38d-aeccd8bb98c0" path="/var/lib/kubelet/pods/5ab0c372-165a-44b2-a38d-aeccd8bb98c0/volumes" Nov 26 15:10:31 crc kubenswrapper[4651]: I1126 15:10:31.783463 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4174859-235d-4a45-ba34-178b888f3513","Type":"ContainerStarted","Data":"e55ebe1cd16b29af8a1a6c289ab5c1771211e44f8021808f82a2c396c0733fa2"} Nov 26 15:10:31 crc kubenswrapper[4651]: I1126 15:10:31.783498 4651 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4174859-235d-4a45-ba34-178b888f3513","Type":"ContainerStarted","Data":"9fe0c15fa558172d17317f79dd27e02b44b0014367e7ed3b5bd4d3bf5f850b3c"} Nov 26 15:10:31 crc kubenswrapper[4651]: I1126 15:10:31.805728 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.805696404 podStartE2EDuration="2.805696404s" podCreationTimestamp="2025-11-26 15:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:10:31.800250154 +0000 UTC m=+1199.225997768" watchObservedRunningTime="2025-11-26 15:10:31.805696404 +0000 UTC m=+1199.231443998" Nov 26 15:10:33 crc kubenswrapper[4651]: I1126 15:10:33.062730 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 15:10:33 crc kubenswrapper[4651]: I1126 15:10:33.063017 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 26 15:10:34 crc kubenswrapper[4651]: I1126 15:10:34.142336 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 26 15:10:38 crc kubenswrapper[4651]: I1126 15:10:38.063659 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 15:10:38 crc kubenswrapper[4651]: I1126 15:10:38.064211 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 26 15:10:39 crc kubenswrapper[4651]: I1126 15:10:39.074210 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71cdc146-583a-403a-9758-de42eea152de" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:10:39 crc 
kubenswrapper[4651]: I1126 15:10:39.074208 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71cdc146-583a-403a-9758-de42eea152de" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 15:10:39 crc kubenswrapper[4651]: I1126 15:10:39.142384 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 26 15:10:39 crc kubenswrapper[4651]: I1126 15:10:39.174776 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 26 15:10:39 crc kubenswrapper[4651]: I1126 15:10:39.613109 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6974b49b94-vzn8h" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 26 15:10:39 crc kubenswrapper[4651]: I1126 15:10:39.613247 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:10:39 crc kubenswrapper[4651]: I1126 15:10:39.907266 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 26 15:10:40 crc kubenswrapper[4651]: I1126 15:10:40.185462 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:10:40 crc kubenswrapper[4651]: I1126 15:10:40.185507 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 26 15:10:40 crc kubenswrapper[4651]: I1126 15:10:40.213832 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 26 15:10:41 crc kubenswrapper[4651]: I1126 15:10:41.198216 
4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c4174859-235d-4a45-ba34-178b888f3513" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 15:10:41 crc kubenswrapper[4651]: I1126 15:10:41.198260 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c4174859-235d-4a45-ba34-178b888f3513" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.027823 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.028339 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5e9fabd8-6c8f-4ff7-960c-29c3105073b5" containerName="kube-state-metrics" containerID="cri-o://5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559" gracePeriod=30 Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.485916 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.668085 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2d8c\" (UniqueName: \"kubernetes.io/projected/5e9fabd8-6c8f-4ff7-960c-29c3105073b5-kube-api-access-h2d8c\") pod \"5e9fabd8-6c8f-4ff7-960c-29c3105073b5\" (UID: \"5e9fabd8-6c8f-4ff7-960c-29c3105073b5\") " Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.673813 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9fabd8-6c8f-4ff7-960c-29c3105073b5-kube-api-access-h2d8c" (OuterVolumeSpecName: "kube-api-access-h2d8c") pod "5e9fabd8-6c8f-4ff7-960c-29c3105073b5" (UID: "5e9fabd8-6c8f-4ff7-960c-29c3105073b5"). InnerVolumeSpecName "kube-api-access-h2d8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.772417 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2d8c\" (UniqueName: \"kubernetes.io/projected/5e9fabd8-6c8f-4ff7-960c-29c3105073b5-kube-api-access-h2d8c\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.941175 4651 generic.go:334] "Generic (PLEG): container finished" podID="5e9fabd8-6c8f-4ff7-960c-29c3105073b5" containerID="5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559" exitCode=2 Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.941223 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5e9fabd8-6c8f-4ff7-960c-29c3105073b5","Type":"ContainerDied","Data":"5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559"} Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.941249 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"5e9fabd8-6c8f-4ff7-960c-29c3105073b5","Type":"ContainerDied","Data":"6523803621b129fb26d0c52b4b33d0bd076170d7103200f45f7936687e0b0065"} Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.941265 4651 scope.go:117] "RemoveContainer" containerID="5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559" Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.941772 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.982203 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.995271 4651 scope.go:117] "RemoveContainer" containerID="5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559" Nov 26 15:10:44 crc kubenswrapper[4651]: E1126 15:10:44.996889 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559\": container with ID starting with 5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559 not found: ID does not exist" containerID="5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559" Nov 26 15:10:44 crc kubenswrapper[4651]: I1126 15:10:44.997063 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559"} err="failed to get container status \"5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559\": rpc error: code = NotFound desc = could not find container \"5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559\": container with ID starting with 5c46607a378e9d745bd7c523cd29f9c84d1651a54411415d95e639c455987559 not found: ID does not exist" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.006614 4651 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.027286 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 15:10:45 crc kubenswrapper[4651]: E1126 15:10:45.027677 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9fabd8-6c8f-4ff7-960c-29c3105073b5" containerName="kube-state-metrics" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.027688 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9fabd8-6c8f-4ff7-960c-29c3105073b5" containerName="kube-state-metrics" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.027886 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9fabd8-6c8f-4ff7-960c-29c3105073b5" containerName="kube-state-metrics" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.028661 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.031978 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.032155 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.039198 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.183100 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2d03fc-6edd-4654-8116-99aae88e3fab-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.183230 4651 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc6wc\" (UniqueName: \"kubernetes.io/projected/8c2d03fc-6edd-4654-8116-99aae88e3fab-kube-api-access-jc6wc\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.183285 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2d03fc-6edd-4654-8116-99aae88e3fab-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.183351 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8c2d03fc-6edd-4654-8116-99aae88e3fab-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.285323 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2d03fc-6edd-4654-8116-99aae88e3fab-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.285418 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc6wc\" (UniqueName: \"kubernetes.io/projected/8c2d03fc-6edd-4654-8116-99aae88e3fab-kube-api-access-jc6wc\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 
15:10:45.285457 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2d03fc-6edd-4654-8116-99aae88e3fab-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.285503 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8c2d03fc-6edd-4654-8116-99aae88e3fab-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.290006 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2d03fc-6edd-4654-8116-99aae88e3fab-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.290594 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8c2d03fc-6edd-4654-8116-99aae88e3fab-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.291873 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2d03fc-6edd-4654-8116-99aae88e3fab-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.313869 4651 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jc6wc\" (UniqueName: \"kubernetes.io/projected/8c2d03fc-6edd-4654-8116-99aae88e3fab-kube-api-access-jc6wc\") pod \"kube-state-metrics-0\" (UID: \"8c2d03fc-6edd-4654-8116-99aae88e3fab\") " pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.354439 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.413555 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9fabd8-6c8f-4ff7-960c-29c3105073b5" path="/var/lib/kubelet/pods/5e9fabd8-6c8f-4ff7-960c-29c3105073b5/volumes" Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.850491 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.952014 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8c2d03fc-6edd-4654-8116-99aae88e3fab","Type":"ContainerStarted","Data":"7b2094a322916d743fcfd1a26c1eb6a9b75e1f13dee3b17f3f3d8129eeaf7b0a"} Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.954903 4651 generic.go:334] "Generic (PLEG): container finished" podID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerID="041b9e0af2f72c70708cca2245ba415e6c5829af6bb51c79a57997f41bb12658" exitCode=137 Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.954947 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6974b49b94-vzn8h" event={"ID":"97c5789f-f8f7-4780-8c73-e34bc5bb4f56","Type":"ContainerDied","Data":"041b9e0af2f72c70708cca2245ba415e6c5829af6bb51c79a57997f41bb12658"} Nov 26 15:10:45 crc kubenswrapper[4651]: I1126 15:10:45.959243 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.112301 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.112571 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="ceilometer-central-agent" containerID="cri-o://f9dee5ef0012e8c3349add1a16d62e442727d20f6ba1b3e413aae6c10419d32c" gracePeriod=30 Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.112652 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="proxy-httpd" containerID="cri-o://7b68710ee6f1287d334bdee507f2d36cbce9d2b080b897729bcc85d7923fc31c" gracePeriod=30 Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.112689 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="ceilometer-notification-agent" containerID="cri-o://cdcf231011b8bd4f241dd24d6be4dc174b10f5366214f0b2bcf941dc6e3c65a8" gracePeriod=30 Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.112895 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="sg-core" containerID="cri-o://b50a4fee48de9858edccce0740141fc837f1f815e6ab7017dc95423b6e59ec5a" gracePeriod=30 Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.114232 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-combined-ca-bundle\") pod \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " Nov 26 15:10:46 crc 
kubenswrapper[4651]: I1126 15:10:46.114281 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-scripts\") pod \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.114320 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-secret-key\") pod \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.114338 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-config-data\") pod \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.114356 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-tls-certs\") pod \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.114383 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4998\" (UniqueName: \"kubernetes.io/projected/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-kube-api-access-q4998\") pod \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.114409 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-logs\") pod 
\"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\" (UID: \"97c5789f-f8f7-4780-8c73-e34bc5bb4f56\") " Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.115248 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-logs" (OuterVolumeSpecName: "logs") pod "97c5789f-f8f7-4780-8c73-e34bc5bb4f56" (UID: "97c5789f-f8f7-4780-8c73-e34bc5bb4f56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.126938 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "97c5789f-f8f7-4780-8c73-e34bc5bb4f56" (UID: "97c5789f-f8f7-4780-8c73-e34bc5bb4f56"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.129157 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-kube-api-access-q4998" (OuterVolumeSpecName: "kube-api-access-q4998") pod "97c5789f-f8f7-4780-8c73-e34bc5bb4f56" (UID: "97c5789f-f8f7-4780-8c73-e34bc5bb4f56"). InnerVolumeSpecName "kube-api-access-q4998". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.148849 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-scripts" (OuterVolumeSpecName: "scripts") pod "97c5789f-f8f7-4780-8c73-e34bc5bb4f56" (UID: "97c5789f-f8f7-4780-8c73-e34bc5bb4f56"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.164235 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-config-data" (OuterVolumeSpecName: "config-data") pod "97c5789f-f8f7-4780-8c73-e34bc5bb4f56" (UID: "97c5789f-f8f7-4780-8c73-e34bc5bb4f56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.166018 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97c5789f-f8f7-4780-8c73-e34bc5bb4f56" (UID: "97c5789f-f8f7-4780-8c73-e34bc5bb4f56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.186254 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "97c5789f-f8f7-4780-8c73-e34bc5bb4f56" (UID: "97c5789f-f8f7-4780-8c73-e34bc5bb4f56"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.217392 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.217441 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.217457 4651 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.217469 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.217481 4651 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.217493 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4998\" (UniqueName: \"kubernetes.io/projected/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-kube-api-access-q4998\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.217507 4651 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97c5789f-f8f7-4780-8c73-e34bc5bb4f56-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.964003 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6974b49b94-vzn8h" event={"ID":"97c5789f-f8f7-4780-8c73-e34bc5bb4f56","Type":"ContainerDied","Data":"50a170759d63aa054145b0a8a35120c8a5e7af6d825d82402ece71dc3ed54d13"} Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.964407 4651 scope.go:117] "RemoveContainer" containerID="e459d337cfdf21c6171a193e9e9d70d57ce29ab97edf0ea60127ef435043b603" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.964559 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6974b49b94-vzn8h" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.969130 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8c2d03fc-6edd-4654-8116-99aae88e3fab","Type":"ContainerStarted","Data":"4c53e8f3c8733df32293257ea5bf33276310b5635496382939e3ba53e1d5b90e"} Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.969599 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.971767 4651 generic.go:334] "Generic (PLEG): container finished" podID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerID="7b68710ee6f1287d334bdee507f2d36cbce9d2b080b897729bcc85d7923fc31c" exitCode=0 Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.971800 4651 generic.go:334] "Generic (PLEG): container finished" podID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerID="b50a4fee48de9858edccce0740141fc837f1f815e6ab7017dc95423b6e59ec5a" exitCode=2 Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.971813 4651 generic.go:334] "Generic (PLEG): container finished" podID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerID="f9dee5ef0012e8c3349add1a16d62e442727d20f6ba1b3e413aae6c10419d32c" exitCode=0 Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.971833 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"da05b2a3-cd0a-43ee-b2c9-617ba633e84a","Type":"ContainerDied","Data":"7b68710ee6f1287d334bdee507f2d36cbce9d2b080b897729bcc85d7923fc31c"} Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.971856 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da05b2a3-cd0a-43ee-b2c9-617ba633e84a","Type":"ContainerDied","Data":"b50a4fee48de9858edccce0740141fc837f1f815e6ab7017dc95423b6e59ec5a"} Nov 26 15:10:46 crc kubenswrapper[4651]: I1126 15:10:46.971866 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da05b2a3-cd0a-43ee-b2c9-617ba633e84a","Type":"ContainerDied","Data":"f9dee5ef0012e8c3349add1a16d62e442727d20f6ba1b3e413aae6c10419d32c"} Nov 26 15:10:47 crc kubenswrapper[4651]: I1126 15:10:47.014515 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.619330891 podStartE2EDuration="3.014495452s" podCreationTimestamp="2025-11-26 15:10:44 +0000 UTC" firstStartedPulling="2025-11-26 15:10:45.865497136 +0000 UTC m=+1213.291244740" lastFinishedPulling="2025-11-26 15:10:46.260661697 +0000 UTC m=+1213.686409301" observedRunningTime="2025-11-26 15:10:46.991977371 +0000 UTC m=+1214.417724985" watchObservedRunningTime="2025-11-26 15:10:47.014495452 +0000 UTC m=+1214.440243056" Nov 26 15:10:47 crc kubenswrapper[4651]: I1126 15:10:47.021197 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6974b49b94-vzn8h"] Nov 26 15:10:47 crc kubenswrapper[4651]: I1126 15:10:47.039100 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6974b49b94-vzn8h"] Nov 26 15:10:47 crc kubenswrapper[4651]: I1126 15:10:47.158911 4651 scope.go:117] "RemoveContainer" containerID="041b9e0af2f72c70708cca2245ba415e6c5829af6bb51c79a57997f41bb12658" Nov 26 15:10:47 crc kubenswrapper[4651]: I1126 15:10:47.413779 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" path="/var/lib/kubelet/pods/97c5789f-f8f7-4780-8c73-e34bc5bb4f56/volumes" Nov 26 15:10:48 crc kubenswrapper[4651]: I1126 15:10:48.097403 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 15:10:48 crc kubenswrapper[4651]: I1126 15:10:48.348746 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 15:10:48 crc kubenswrapper[4651]: I1126 15:10:48.349182 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 26 15:10:48 crc kubenswrapper[4651]: I1126 15:10:48.990388 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 26 15:10:50 crc kubenswrapper[4651]: I1126 15:10:50.193565 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 15:10:50 crc kubenswrapper[4651]: I1126 15:10:50.194164 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 26 15:10:50 crc kubenswrapper[4651]: I1126 15:10:50.196463 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 26 15:10:50 crc kubenswrapper[4651]: I1126 15:10:50.200108 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.002842 4651 generic.go:334] "Generic (PLEG): container finished" podID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerID="cdcf231011b8bd4f241dd24d6be4dc174b10f5366214f0b2bcf941dc6e3c65a8" exitCode=0 Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.003027 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da05b2a3-cd0a-43ee-b2c9-617ba633e84a","Type":"ContainerDied","Data":"cdcf231011b8bd4f241dd24d6be4dc174b10f5366214f0b2bcf941dc6e3c65a8"} 
Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.005259 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.020223 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.155686 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.322252 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-combined-ca-bundle\") pod \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.322333 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-sg-core-conf-yaml\") pod \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.322426 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-scripts\") pod \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.322449 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5t26\" (UniqueName: \"kubernetes.io/projected/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-kube-api-access-x5t26\") pod \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.322470 4651 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-config-data\") pod \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.322512 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-run-httpd\") pod \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.322555 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-log-httpd\") pod \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\" (UID: \"da05b2a3-cd0a-43ee-b2c9-617ba633e84a\") " Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.323338 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "da05b2a3-cd0a-43ee-b2c9-617ba633e84a" (UID: "da05b2a3-cd0a-43ee-b2c9-617ba633e84a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.325453 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "da05b2a3-cd0a-43ee-b2c9-617ba633e84a" (UID: "da05b2a3-cd0a-43ee-b2c9-617ba633e84a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.328782 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-kube-api-access-x5t26" (OuterVolumeSpecName: "kube-api-access-x5t26") pod "da05b2a3-cd0a-43ee-b2c9-617ba633e84a" (UID: "da05b2a3-cd0a-43ee-b2c9-617ba633e84a"). InnerVolumeSpecName "kube-api-access-x5t26". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.328903 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-scripts" (OuterVolumeSpecName: "scripts") pod "da05b2a3-cd0a-43ee-b2c9-617ba633e84a" (UID: "da05b2a3-cd0a-43ee-b2c9-617ba633e84a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.367240 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "da05b2a3-cd0a-43ee-b2c9-617ba633e84a" (UID: "da05b2a3-cd0a-43ee-b2c9-617ba633e84a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.424938 4651 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.425612 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.425711 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5t26\" (UniqueName: \"kubernetes.io/projected/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-kube-api-access-x5t26\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.425508 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da05b2a3-cd0a-43ee-b2c9-617ba633e84a" (UID: "da05b2a3-cd0a-43ee-b2c9-617ba633e84a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.425801 4651 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.425943 4651 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.426069 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-config-data" (OuterVolumeSpecName: "config-data") pod "da05b2a3-cd0a-43ee-b2c9-617ba633e84a" (UID: "da05b2a3-cd0a-43ee-b2c9-617ba633e84a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.528283 4651 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:51 crc kubenswrapper[4651]: I1126 15:10:51.528317 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da05b2a3-cd0a-43ee-b2c9-617ba633e84a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.016830 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da05b2a3-cd0a-43ee-b2c9-617ba633e84a","Type":"ContainerDied","Data":"0b40feb516b9e2d1ed6d0ed8bedcb649593916f87058f4eab88e58e70211dba8"} Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.016899 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.016929 4651 scope.go:117] "RemoveContainer" containerID="7b68710ee6f1287d334bdee507f2d36cbce9d2b080b897729bcc85d7923fc31c" Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.035745 4651 scope.go:117] "RemoveContainer" containerID="b50a4fee48de9858edccce0740141fc837f1f815e6ab7017dc95423b6e59ec5a" Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.081135 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.083761 4651 scope.go:117] "RemoveContainer" containerID="cdcf231011b8bd4f241dd24d6be4dc174b10f5366214f0b2bcf941dc6e3c65a8" Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.087322 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098047 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 26 15:10:52 crc kubenswrapper[4651]: E1126 15:10:52.098429 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="proxy-httpd" Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098452 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="proxy-httpd" Nov 26 15:10:52 crc kubenswrapper[4651]: E1126 15:10:52.098468 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="sg-core" Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098474 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="sg-core" Nov 26 15:10:52 crc kubenswrapper[4651]: E1126 15:10:52.098481 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon" 
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098488 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon"
Nov 26 15:10:52 crc kubenswrapper[4651]: E1126 15:10:52.098499 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="ceilometer-central-agent"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098505 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="ceilometer-central-agent"
Nov 26 15:10:52 crc kubenswrapper[4651]: E1126 15:10:52.098513 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098518 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon"
Nov 26 15:10:52 crc kubenswrapper[4651]: E1126 15:10:52.098525 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon-log"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098531 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon-log"
Nov 26 15:10:52 crc kubenswrapper[4651]: E1126 15:10:52.098542 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="ceilometer-notification-agent"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098548 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="ceilometer-notification-agent"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098714 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon-log"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098728 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="ceilometer-notification-agent"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098737 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098748 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098757 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098767 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="sg-core"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098780 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="proxy-httpd"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098787 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" containerName="ceilometer-central-agent"
Nov 26 15:10:52 crc kubenswrapper[4651]: E1126 15:10:52.098969 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.098983 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c5789f-f8f7-4780-8c73-e34bc5bb4f56" containerName="horizon"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.100512 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.103277 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.103568 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.103831 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.122216 4651 scope.go:117] "RemoveContainer" containerID="f9dee5ef0012e8c3349add1a16d62e442727d20f6ba1b3e413aae6c10419d32c"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.126107 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.242410 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg6cn\" (UniqueName: \"kubernetes.io/projected/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-kube-api-access-wg6cn\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.242510 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.242607 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.242639 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-log-httpd\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.242661 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-scripts\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.242678 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-config-data\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.242702 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.242761 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-run-httpd\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.345125 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg6cn\" (UniqueName: \"kubernetes.io/projected/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-kube-api-access-wg6cn\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.345388 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.345658 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.345728 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-log-httpd\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.345759 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-scripts\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.345786 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-config-data\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.345831 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.346223 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-log-httpd\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.346566 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-run-httpd\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.347305 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-run-httpd\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.350092 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.361786 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.362291 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.363093 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-scripts\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.364176 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-config-data\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.367379 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg6cn\" (UniqueName: \"kubernetes.io/projected/b7f683f0-e63a-41d1-9c75-adc0175d9c9c-kube-api-access-wg6cn\") pod \"ceilometer-0\" (UID: \"b7f683f0-e63a-41d1-9c75-adc0175d9c9c\") " pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.436387 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 26 15:10:52 crc kubenswrapper[4651]: W1126 15:10:52.956756 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7f683f0_e63a_41d1_9c75_adc0175d9c9c.slice/crio-31c1c184fc5b9e7709f4e4bf390e8d6bf61683953370ae0a5194d1e4aa2ef7c5 WatchSource:0}: Error finding container 31c1c184fc5b9e7709f4e4bf390e8d6bf61683953370ae0a5194d1e4aa2ef7c5: Status 404 returned error can't find the container with id 31c1c184fc5b9e7709f4e4bf390e8d6bf61683953370ae0a5194d1e4aa2ef7c5
Nov 26 15:10:52 crc kubenswrapper[4651]: I1126 15:10:52.956821 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 26 15:10:53 crc kubenswrapper[4651]: I1126 15:10:53.027376 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7f683f0-e63a-41d1-9c75-adc0175d9c9c","Type":"ContainerStarted","Data":"31c1c184fc5b9e7709f4e4bf390e8d6bf61683953370ae0a5194d1e4aa2ef7c5"}
Nov 26 15:10:53 crc kubenswrapper[4651]: I1126 15:10:53.417327 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da05b2a3-cd0a-43ee-b2c9-617ba633e84a" path="/var/lib/kubelet/pods/da05b2a3-cd0a-43ee-b2c9-617ba633e84a/volumes"
Nov 26 15:10:54 crc kubenswrapper[4651]: I1126 15:10:54.056550 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7f683f0-e63a-41d1-9c75-adc0175d9c9c","Type":"ContainerStarted","Data":"58b91316fb6a14c4bd104a5d7a22ec5c151a4189bd76479fec86a7117095efb8"}
Nov 26 15:10:55 crc kubenswrapper[4651]: I1126 15:10:55.070443 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7f683f0-e63a-41d1-9c75-adc0175d9c9c","Type":"ContainerStarted","Data":"f8343f094bb0fff5556f0f45dc999e0a438f4748b0a259cc579c515ba37b9d4c"}
Nov 26 15:10:55 crc kubenswrapper[4651]: I1126 15:10:55.376603 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Nov 26 15:10:56 crc kubenswrapper[4651]: I1126 15:10:56.082806 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7f683f0-e63a-41d1-9c75-adc0175d9c9c","Type":"ContainerStarted","Data":"16576b70f5cd4d2e20622338f18c9ac8cce2cfc2c6e0867aa9faf5282c6710cf"}
Nov 26 15:10:58 crc kubenswrapper[4651]: I1126 15:10:58.105922 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7f683f0-e63a-41d1-9c75-adc0175d9c9c","Type":"ContainerStarted","Data":"d397d71d4f8ed554147a1b79753a441f2c1ec2ea235b073b231e437002d412fe"}
Nov 26 15:10:58 crc kubenswrapper[4651]: I1126 15:10:58.107983 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 26 15:10:58 crc kubenswrapper[4651]: I1126 15:10:58.146824 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.094878396 podStartE2EDuration="6.146569808s" podCreationTimestamp="2025-11-26 15:10:52 +0000 UTC" firstStartedPulling="2025-11-26 15:10:52.960492122 +0000 UTC m=+1220.386239726" lastFinishedPulling="2025-11-26 15:10:57.012183534 +0000 UTC m=+1224.437931138" observedRunningTime="2025-11-26 15:10:58.140552482 +0000 UTC m=+1225.566300106" watchObservedRunningTime="2025-11-26 15:10:58.146569808 +0000 UTC m=+1225.572317412"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.075772 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-b5b9c"]
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.076931 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.083625 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.084437 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.109475 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-b5b9c"]
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.183248 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-combined-ca-bundle\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.183777 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6j4v\" (UniqueName: \"kubernetes.io/projected/7035c6b3-8bd2-4447-9a56-bee3af6dceae-kube-api-access-f6j4v\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.183880 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-ring-data-devices\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.183898 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-scripts\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.183938 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-dispersionconf\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.183984 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-swiftconf\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.184067 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7035c6b3-8bd2-4447-9a56-bee3af6dceae-etc-swift\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.285699 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-swiftconf\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.285775 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7035c6b3-8bd2-4447-9a56-bee3af6dceae-etc-swift\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.285823 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-combined-ca-bundle\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.285854 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6j4v\" (UniqueName: \"kubernetes.io/projected/7035c6b3-8bd2-4447-9a56-bee3af6dceae-kube-api-access-f6j4v\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.285916 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-ring-data-devices\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.285935 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-scripts\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.285966 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-dispersionconf\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.287665 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-scripts\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.295421 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7035c6b3-8bd2-4447-9a56-bee3af6dceae-etc-swift\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.295513 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-dispersionconf\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.295519 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-swiftconf\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.295710 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-ring-data-devices\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.322305 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-combined-ca-bundle\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.325886 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6j4v\" (UniqueName: \"kubernetes.io/projected/7035c6b3-8bd2-4447-9a56-bee3af6dceae-kube-api-access-f6j4v\") pod \"swift-ring-rebalance-b5b9c\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") " pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.402335 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-d4kwd"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.410736 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:10:59 crc kubenswrapper[4651]: I1126 15:10:59.888264 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-b5b9c"]
Nov 26 15:11:00 crc kubenswrapper[4651]: I1126 15:11:00.125247 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-b5b9c" event={"ID":"7035c6b3-8bd2-4447-9a56-bee3af6dceae","Type":"ContainerStarted","Data":"28c6c303ad17a2a6bd4ef9211f4528aef00d540bc58d4dc30a6f188e80e38021"}
Nov 26 15:11:05 crc kubenswrapper[4651]: I1126 15:11:05.184180 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-b5b9c" event={"ID":"7035c6b3-8bd2-4447-9a56-bee3af6dceae","Type":"ContainerStarted","Data":"d350c716c6802a67ec8b4bbe781601fe4112d9198a17560e4f2840692c767d75"}
Nov 26 15:11:05 crc kubenswrapper[4651]: I1126 15:11:05.201212 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-b5b9c" podStartSLOduration=1.698554948 podStartE2EDuration="6.201190249s" podCreationTimestamp="2025-11-26 15:10:59 +0000 UTC" firstStartedPulling="2025-11-26 15:10:59.895980319 +0000 UTC m=+1227.321727923" lastFinishedPulling="2025-11-26 15:11:04.39861563 +0000 UTC m=+1231.824363224" observedRunningTime="2025-11-26 15:11:05.20031747 +0000 UTC m=+1232.626065094" watchObservedRunningTime="2025-11-26 15:11:05.201190249 +0000 UTC m=+1232.626937853"
Nov 26 15:11:12 crc kubenswrapper[4651]: I1126 15:11:12.246583 4651 generic.go:334] "Generic (PLEG): container finished" podID="7035c6b3-8bd2-4447-9a56-bee3af6dceae" containerID="d350c716c6802a67ec8b4bbe781601fe4112d9198a17560e4f2840692c767d75" exitCode=0
Nov 26 15:11:12 crc kubenswrapper[4651]: I1126 15:11:12.246698 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-b5b9c" event={"ID":"7035c6b3-8bd2-4447-9a56-bee3af6dceae","Type":"ContainerDied","Data":"d350c716c6802a67ec8b4bbe781601fe4112d9198a17560e4f2840692c767d75"}
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.585386 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.671476 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-swiftconf\") pod \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") "
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.671547 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-combined-ca-bundle\") pod \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") "
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.671595 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-ring-data-devices\") pod \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") "
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.671628 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7035c6b3-8bd2-4447-9a56-bee3af6dceae-etc-swift\") pod \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") "
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.671677 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6j4v\" (UniqueName: \"kubernetes.io/projected/7035c6b3-8bd2-4447-9a56-bee3af6dceae-kube-api-access-f6j4v\") pod \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") "
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.671727 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-scripts\") pod \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") "
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.671749 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-dispersionconf\") pod \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\" (UID: \"7035c6b3-8bd2-4447-9a56-bee3af6dceae\") "
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.672958 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7035c6b3-8bd2-4447-9a56-bee3af6dceae-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7035c6b3-8bd2-4447-9a56-bee3af6dceae" (UID: "7035c6b3-8bd2-4447-9a56-bee3af6dceae"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.673636 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7035c6b3-8bd2-4447-9a56-bee3af6dceae" (UID: "7035c6b3-8bd2-4447-9a56-bee3af6dceae"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.692651 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7035c6b3-8bd2-4447-9a56-bee3af6dceae-kube-api-access-f6j4v" (OuterVolumeSpecName: "kube-api-access-f6j4v") pod "7035c6b3-8bd2-4447-9a56-bee3af6dceae" (UID: "7035c6b3-8bd2-4447-9a56-bee3af6dceae"). InnerVolumeSpecName "kube-api-access-f6j4v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.695358 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-scripts" (OuterVolumeSpecName: "scripts") pod "7035c6b3-8bd2-4447-9a56-bee3af6dceae" (UID: "7035c6b3-8bd2-4447-9a56-bee3af6dceae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.697904 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7035c6b3-8bd2-4447-9a56-bee3af6dceae" (UID: "7035c6b3-8bd2-4447-9a56-bee3af6dceae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.698170 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7035c6b3-8bd2-4447-9a56-bee3af6dceae" (UID: "7035c6b3-8bd2-4447-9a56-bee3af6dceae"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.705158 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7035c6b3-8bd2-4447-9a56-bee3af6dceae" (UID: "7035c6b3-8bd2-4447-9a56-bee3af6dceae"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.774221 4651 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-scripts\") on node \"crc\" DevicePath \"\""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.775143 4651 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-dispersionconf\") on node \"crc\" DevicePath \"\""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.775198 4651 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-swiftconf\") on node \"crc\" DevicePath \"\""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.775213 4651 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7035c6b3-8bd2-4447-9a56-bee3af6dceae-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.775225 4651 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7035c6b3-8bd2-4447-9a56-bee3af6dceae-ring-data-devices\") on node \"crc\" DevicePath \"\""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.775238 4651 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7035c6b3-8bd2-4447-9a56-bee3af6dceae-etc-swift\") on node \"crc\" DevicePath \"\""
Nov 26 15:11:13 crc kubenswrapper[4651]: I1126 15:11:13.775250 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6j4v\" (UniqueName: \"kubernetes.io/projected/7035c6b3-8bd2-4447-9a56-bee3af6dceae-kube-api-access-f6j4v\") on node \"crc\" DevicePath \"\""
Nov 26 15:11:14 crc kubenswrapper[4651]: I1126 15:11:14.268488 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-b5b9c" event={"ID":"7035c6b3-8bd2-4447-9a56-bee3af6dceae","Type":"ContainerDied","Data":"28c6c303ad17a2a6bd4ef9211f4528aef00d540bc58d4dc30a6f188e80e38021"}
Nov 26 15:11:14 crc kubenswrapper[4651]: I1126 15:11:14.268783 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28c6c303ad17a2a6bd4ef9211f4528aef00d540bc58d4dc30a6f188e80e38021"
Nov 26 15:11:14 crc kubenswrapper[4651]: I1126 15:11:14.268720 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-b5b9c"
Nov 26 15:11:22 crc kubenswrapper[4651]: I1126 15:11:22.456245 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Nov 26 15:11:29 crc kubenswrapper[4651]: I1126 15:11:29.132431 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 15:11:29 crc kubenswrapper[4651]: I1126 15:11:29.132846 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 15:11:50 crc kubenswrapper[4651]: E1126 15:11:50.176075 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openstack/swift-storage-0" podUID="a3b8c2db-ce7f-48ce-9fd1-d55b5583773e"
Nov 26 15:11:50 crc kubenswrapper[4651]: I1126 15:11:50.638873 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Nov 26 15:11:50 crc kubenswrapper[4651]: I1126 15:11:50.956285 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Nov 26 15:11:50 crc kubenswrapper[4651]: E1126 15:11:50.956885 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7035c6b3-8bd2-4447-9a56-bee3af6dceae" containerName="swift-ring-rebalance"
Nov 26 15:11:50 crc kubenswrapper[4651]: I1126 15:11:50.956918 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="7035c6b3-8bd2-4447-9a56-bee3af6dceae" containerName="swift-ring-rebalance"
Nov 26 15:11:50 crc kubenswrapper[4651]: I1126 15:11:50.957302 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="7035c6b3-8bd2-4447-9a56-bee3af6dceae" containerName="swift-ring-rebalance"
Nov 26 15:11:50 crc kubenswrapper[4651]: I1126 15:11:50.958364 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 26 15:11:50 crc kubenswrapper[4651]: I1126 15:11:50.965079 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Nov 26 15:11:50 crc kubenswrapper[4651]: I1126 15:11:50.965451 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Nov 26 15:11:50 crc kubenswrapper[4651]: I1126 15:11:50.977190 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Nov 26 15:11:50 crc kubenswrapper[4651]: I1126 15:11:50.981122 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"427f2b03-d59e-4197-8d5c-3b3d7c1692ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 26 15:11:50
crc kubenswrapper[4651]: I1126 15:11:50.981208 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"427f2b03-d59e-4197-8d5c-3b3d7c1692ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:11:51 crc kubenswrapper[4651]: I1126 15:11:51.082795 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"427f2b03-d59e-4197-8d5c-3b3d7c1692ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:11:51 crc kubenswrapper[4651]: I1126 15:11:51.082919 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"427f2b03-d59e-4197-8d5c-3b3d7c1692ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:11:51 crc kubenswrapper[4651]: I1126 15:11:51.083131 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"427f2b03-d59e-4197-8d5c-3b3d7c1692ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:11:51 crc kubenswrapper[4651]: I1126 15:11:51.108775 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"427f2b03-d59e-4197-8d5c-3b3d7c1692ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:11:51 crc kubenswrapper[4651]: I1126 15:11:51.285802 4651 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:11:51 crc kubenswrapper[4651]: I1126 15:11:51.729266 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 26 15:11:52 crc kubenswrapper[4651]: I1126 15:11:52.312286 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:11:52 crc kubenswrapper[4651]: I1126 15:11:52.324660 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a3b8c2db-ce7f-48ce-9fd1-d55b5583773e-etc-swift\") pod \"swift-storage-0\" (UID: \"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e\") " pod="openstack/swift-storage-0" Nov 26 15:11:52 crc kubenswrapper[4651]: I1126 15:11:52.441445 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 26 15:11:53 crc kubenswrapper[4651]: I1126 15:11:52.669999 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"427f2b03-d59e-4197-8d5c-3b3d7c1692ed","Type":"ContainerStarted","Data":"7c61bbc2f9b34a8e398397e5b5e58596a30ff3c35f231634a6df5f14a6820f86"} Nov 26 15:11:53 crc kubenswrapper[4651]: I1126 15:11:52.670326 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"427f2b03-d59e-4197-8d5c-3b3d7c1692ed","Type":"ContainerStarted","Data":"e34d7f1dc4bb8d242f598372c7fcf7d716f3f87918d869cdca3718b8e1725322"} Nov 26 15:11:53 crc kubenswrapper[4651]: I1126 15:11:52.707724 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.7077032279999997 podStartE2EDuration="2.707703228s" podCreationTimestamp="2025-11-26 15:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:11:52.698664021 +0000 UTC m=+1280.124411635" watchObservedRunningTime="2025-11-26 15:11:52.707703228 +0000 UTC m=+1280.133450842" Nov 26 15:11:53 crc kubenswrapper[4651]: I1126 15:11:53.012779 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 26 15:11:53 crc kubenswrapper[4651]: I1126 15:11:53.681504 4651 generic.go:334] "Generic (PLEG): container finished" podID="427f2b03-d59e-4197-8d5c-3b3d7c1692ed" containerID="7c61bbc2f9b34a8e398397e5b5e58596a30ff3c35f231634a6df5f14a6820f86" exitCode=0 Nov 26 15:11:53 crc kubenswrapper[4651]: I1126 15:11:53.681574 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"427f2b03-d59e-4197-8d5c-3b3d7c1692ed","Type":"ContainerDied","Data":"7c61bbc2f9b34a8e398397e5b5e58596a30ff3c35f231634a6df5f14a6820f86"} Nov 26 15:11:53 crc kubenswrapper[4651]: I1126 15:11:53.683772 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"3976173be312a5072f69fc3425c507684234f5ffefbe57f34f566fe422c9a2df"} Nov 26 15:11:54 crc kubenswrapper[4651]: I1126 15:11:54.696757 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"d19f9ba8dfe80b38b12ccc4f3ada151e32ac86393450d60de5fb4ee036855566"} Nov 26 15:11:54 crc kubenswrapper[4651]: I1126 15:11:54.697439 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"9f10b0adebd25939d26f90901b505a05bf7d2e1166bf40418c51443c5be32085"} Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.018763 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.063125 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kubelet-dir\") pod \"427f2b03-d59e-4197-8d5c-3b3d7c1692ed\" (UID: \"427f2b03-d59e-4197-8d5c-3b3d7c1692ed\") " Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.063298 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kube-api-access\") pod \"427f2b03-d59e-4197-8d5c-3b3d7c1692ed\" (UID: \"427f2b03-d59e-4197-8d5c-3b3d7c1692ed\") " Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.065016 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "427f2b03-d59e-4197-8d5c-3b3d7c1692ed" (UID: "427f2b03-d59e-4197-8d5c-3b3d7c1692ed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.089238 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "427f2b03-d59e-4197-8d5c-3b3d7c1692ed" (UID: "427f2b03-d59e-4197-8d5c-3b3d7c1692ed"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.165809 4651 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.166166 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/427f2b03-d59e-4197-8d5c-3b3d7c1692ed-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.709824 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"427f2b03-d59e-4197-8d5c-3b3d7c1692ed","Type":"ContainerDied","Data":"e34d7f1dc4bb8d242f598372c7fcf7d716f3f87918d869cdca3718b8e1725322"} Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.710474 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e34d7f1dc4bb8d242f598372c7fcf7d716f3f87918d869cdca3718b8e1725322" Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.709868 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.715468 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"ca6f75512802bfd144f22a02ee515c257fcc5085a099be11dd5c83600996b406"} Nov 26 15:11:55 crc kubenswrapper[4651]: I1126 15:11:55.715519 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"f4f5559ce9783eec8f151d673a805d0d31aaf51402e3d2a9668878eda49d78de"} Nov 26 15:11:56 crc kubenswrapper[4651]: I1126 15:11:56.741194 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"a0e13fcf8018c402fcd66dc412c413cef59778e3368cada132a8f3ddca1a4aac"} Nov 26 15:11:57 crc kubenswrapper[4651]: I1126 15:11:57.754430 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"1d0fc89c2c7fa10e3dddd42417f9875e8b9458f19be0236fb56ddeb714843ae7"} Nov 26 15:11:57 crc kubenswrapper[4651]: I1126 15:11:57.754796 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"c132eb51a1eb929d1978ab95ef05e23e2d72fd15448245b6e02dc0c84e17e8bf"} Nov 26 15:11:57 crc kubenswrapper[4651]: I1126 15:11:57.754811 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"63b9f6a17619ddc5592d0ba03aada08cc8c551c238840740a86bae6e69b374a2"} Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.372584 4651 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 15:11:58 crc kubenswrapper[4651]: E1126 15:11:58.373125 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427f2b03-d59e-4197-8d5c-3b3d7c1692ed" containerName="pruner" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.373149 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="427f2b03-d59e-4197-8d5c-3b3d7c1692ed" containerName="pruner" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.373394 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="427f2b03-d59e-4197-8d5c-3b3d7c1692ed" containerName="pruner" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.376772 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.381171 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.382491 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.384840 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.534349 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-var-lock\") pod \"installer-9-crc\" (UID: \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.534394 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kube-api-access\") pod \"installer-9-crc\" (UID: \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.534540 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.636523 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.636769 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-var-lock\") pod \"installer-9-crc\" (UID: \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.636803 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kube-api-access\") pod \"installer-9-crc\" (UID: \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.636637 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.636868 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-var-lock\") pod \"installer-9-crc\" (UID: \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.660232 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kube-api-access\") pod \"installer-9-crc\" (UID: \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.721583 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:11:58 crc kubenswrapper[4651]: I1126 15:11:58.769012 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"394b2e4a60e5e45a380a900915170f4ad0aac173d0e9fdcc7b1755336ea5feb9"} Nov 26 15:11:59 crc kubenswrapper[4651]: I1126 15:11:59.132980 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:11:59 crc kubenswrapper[4651]: I1126 15:11:59.133346 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:11:59 crc kubenswrapper[4651]: I1126 15:11:59.226613 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 15:11:59 crc kubenswrapper[4651]: W1126 15:11:59.239374 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode4bbb04d_cfcf_46bd_9f9a_2700a64b37c2.slice/crio-f72984af8379e1082d4e0cfb9b53bb17b67059ac63878d415d5692936b3eb148 WatchSource:0}: Error finding container f72984af8379e1082d4e0cfb9b53bb17b67059ac63878d415d5692936b3eb148: Status 404 returned error can't find the container with id f72984af8379e1082d4e0cfb9b53bb17b67059ac63878d415d5692936b3eb148 Nov 26 15:11:59 crc kubenswrapper[4651]: I1126 15:11:59.786833 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"a0b42686864334e189c6d7f3d65f0e2621639dae9a7f6ff9502eb8970008babc"} Nov 26 15:11:59 crc kubenswrapper[4651]: I1126 15:11:59.787195 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"a0f44602e278031af2e63442993cc48d1b463fa492bfca8092ecb0b95769ff25"} Nov 26 15:11:59 crc kubenswrapper[4651]: I1126 15:11:59.787209 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"997cd4438f22468c3bcebe3c5790c770ed0285cb3aaedd4779492b6d3dda5d8e"} Nov 26 15:11:59 crc kubenswrapper[4651]: I1126 15:11:59.787221 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"c4ccd64a659718c54995a6cae4389f7988319ed730bea40496160b80e2b48384"} Nov 26 15:11:59 crc 
kubenswrapper[4651]: I1126 15:11:59.787230 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"54e0a6e8133b111ad6a65a2c1afa5979ef03565a57ef3da477b76d1fa2f3021c"} Nov 26 15:11:59 crc kubenswrapper[4651]: I1126 15:11:59.788803 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2","Type":"ContainerStarted","Data":"7044ba9af971cb3dc8203577b229bf9b85775ccd0f5018e496100fbde1161145"} Nov 26 15:11:59 crc kubenswrapper[4651]: I1126 15:11:59.788825 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2","Type":"ContainerStarted","Data":"f72984af8379e1082d4e0cfb9b53bb17b67059ac63878d415d5692936b3eb148"} Nov 26 15:11:59 crc kubenswrapper[4651]: I1126 15:11:59.806365 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.806342904 podStartE2EDuration="1.806342904s" podCreationTimestamp="2025-11-26 15:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:11:59.804602027 +0000 UTC m=+1287.230349631" watchObservedRunningTime="2025-11-26 15:11:59.806342904 +0000 UTC m=+1287.232090518" Nov 26 15:12:00 crc kubenswrapper[4651]: I1126 15:12:00.806433 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a3b8c2db-ce7f-48ce-9fd1-d55b5583773e","Type":"ContainerStarted","Data":"d372ebc019597fb3db50e6d50a42d7c106303d6dbea775cee9ec691d4f88719e"} Nov 26 15:12:00 crc kubenswrapper[4651]: I1126 15:12:00.865940 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=376.451333876 
podStartE2EDuration="6m21.865914827s" podCreationTimestamp="2025-11-26 15:05:39 +0000 UTC" firstStartedPulling="2025-11-26 15:11:53.037268982 +0000 UTC m=+1280.463016586" lastFinishedPulling="2025-11-26 15:11:58.451849913 +0000 UTC m=+1285.877597537" observedRunningTime="2025-11-26 15:12:00.858940975 +0000 UTC m=+1288.284688609" watchObservedRunningTime="2025-11-26 15:12:00.865914827 +0000 UTC m=+1288.291662431" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.265825 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85649f948c-kg5nr"] Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.269708 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.293541 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.293920 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-ovsdbserver-sb\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.294015 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-dns-svc\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.294056 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k76z\" (UniqueName: 
\"kubernetes.io/projected/3bc609df-69f6-442d-80fb-f070eb8b674c-kube-api-access-8k76z\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.294078 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-ovsdbserver-nb\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.294129 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-config\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.294144 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-dns-swift-storage-0\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.302596 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85649f948c-kg5nr"] Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.395277 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-ovsdbserver-sb\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 
15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.395405 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-dns-svc\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.395429 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k76z\" (UniqueName: \"kubernetes.io/projected/3bc609df-69f6-442d-80fb-f070eb8b674c-kube-api-access-8k76z\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.395452 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-ovsdbserver-nb\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.395503 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-config\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.395521 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-dns-swift-storage-0\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.396560 4651 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-dns-svc\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.396929 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-ovsdbserver-sb\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.397348 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-dns-swift-storage-0\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.397972 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-ovsdbserver-nb\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.398383 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc609df-69f6-442d-80fb-f070eb8b674c-config\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.433947 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k76z\" (UniqueName: 
\"kubernetes.io/projected/3bc609df-69f6-442d-80fb-f070eb8b674c-kube-api-access-8k76z\") pod \"dnsmasq-dns-85649f948c-kg5nr\" (UID: \"3bc609df-69f6-442d-80fb-f070eb8b674c\") " pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:01 crc kubenswrapper[4651]: I1126 15:12:01.613720 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:02 crc kubenswrapper[4651]: I1126 15:12:02.132402 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85649f948c-kg5nr"] Nov 26 15:12:02 crc kubenswrapper[4651]: I1126 15:12:02.832003 4651 generic.go:334] "Generic (PLEG): container finished" podID="3bc609df-69f6-442d-80fb-f070eb8b674c" containerID="a1e0cf10191ff422e76fb9b63f386fb2f2cd0ba57af5bcba4828fedfad8b68e8" exitCode=0 Nov 26 15:12:02 crc kubenswrapper[4651]: I1126 15:12:02.832104 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85649f948c-kg5nr" event={"ID":"3bc609df-69f6-442d-80fb-f070eb8b674c","Type":"ContainerDied","Data":"a1e0cf10191ff422e76fb9b63f386fb2f2cd0ba57af5bcba4828fedfad8b68e8"} Nov 26 15:12:02 crc kubenswrapper[4651]: I1126 15:12:02.832354 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85649f948c-kg5nr" event={"ID":"3bc609df-69f6-442d-80fb-f070eb8b674c","Type":"ContainerStarted","Data":"304e4cb56604d8d7e4c062bbee8ec9bf948dabb8fcd58c94acce537179c0b9f1"} Nov 26 15:12:03 crc kubenswrapper[4651]: I1126 15:12:03.845317 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85649f948c-kg5nr" event={"ID":"3bc609df-69f6-442d-80fb-f070eb8b674c","Type":"ContainerStarted","Data":"9223a059f1220882f8f934ebd0c7b37ab4e8e09a8f0e70cbbbb1e910052bec21"} Nov 26 15:12:03 crc kubenswrapper[4651]: I1126 15:12:03.845567 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:03 crc kubenswrapper[4651]: I1126 
15:12:03.866050 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85649f948c-kg5nr" podStartSLOduration=2.866000572 podStartE2EDuration="2.866000572s" podCreationTimestamp="2025-11-26 15:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:12:03.861262613 +0000 UTC m=+1291.287010237" watchObservedRunningTime="2025-11-26 15:12:03.866000572 +0000 UTC m=+1291.291748176" Nov 26 15:12:11 crc kubenswrapper[4651]: I1126 15:12:11.615351 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85649f948c-kg5nr" Nov 26 15:12:11 crc kubenswrapper[4651]: I1126 15:12:11.685426 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95bd95597-dqpwc"] Nov 26 15:12:11 crc kubenswrapper[4651]: I1126 15:12:11.685991 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" podUID="8d312157-9607-429b-bcb0-c6bc126938a8" containerName="dnsmasq-dns" containerID="cri-o://436eabcc04ab03bec5e3a58f278a272b9c4d282d2fdaf634f3a33425311680dd" gracePeriod=10 Nov 26 15:12:11 crc kubenswrapper[4651]: I1126 15:12:11.919286 4651 generic.go:334] "Generic (PLEG): container finished" podID="8d312157-9607-429b-bcb0-c6bc126938a8" containerID="436eabcc04ab03bec5e3a58f278a272b9c4d282d2fdaf634f3a33425311680dd" exitCode=0 Nov 26 15:12:11 crc kubenswrapper[4651]: I1126 15:12:11.919324 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" event={"ID":"8d312157-9607-429b-bcb0-c6bc126938a8","Type":"ContainerDied","Data":"436eabcc04ab03bec5e3a58f278a272b9c4d282d2fdaf634f3a33425311680dd"} Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.199480 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.316366 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-sb\") pod \"8d312157-9607-429b-bcb0-c6bc126938a8\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.316474 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfcbb\" (UniqueName: \"kubernetes.io/projected/8d312157-9607-429b-bcb0-c6bc126938a8-kube-api-access-xfcbb\") pod \"8d312157-9607-429b-bcb0-c6bc126938a8\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.316608 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-dns-svc\") pod \"8d312157-9607-429b-bcb0-c6bc126938a8\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.316675 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-config\") pod \"8d312157-9607-429b-bcb0-c6bc126938a8\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.316713 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-nb\") pod \"8d312157-9607-429b-bcb0-c6bc126938a8\" (UID: \"8d312157-9607-429b-bcb0-c6bc126938a8\") " Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.322700 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8d312157-9607-429b-bcb0-c6bc126938a8-kube-api-access-xfcbb" (OuterVolumeSpecName: "kube-api-access-xfcbb") pod "8d312157-9607-429b-bcb0-c6bc126938a8" (UID: "8d312157-9607-429b-bcb0-c6bc126938a8"). InnerVolumeSpecName "kube-api-access-xfcbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.367663 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8d312157-9607-429b-bcb0-c6bc126938a8" (UID: "8d312157-9607-429b-bcb0-c6bc126938a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.371711 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8d312157-9607-429b-bcb0-c6bc126938a8" (UID: "8d312157-9607-429b-bcb0-c6bc126938a8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.384211 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-config" (OuterVolumeSpecName: "config") pod "8d312157-9607-429b-bcb0-c6bc126938a8" (UID: "8d312157-9607-429b-bcb0-c6bc126938a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.384397 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8d312157-9607-429b-bcb0-c6bc126938a8" (UID: "8d312157-9607-429b-bcb0-c6bc126938a8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.422847 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfcbb\" (UniqueName: \"kubernetes.io/projected/8d312157-9607-429b-bcb0-c6bc126938a8-kube-api-access-xfcbb\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.422912 4651 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.422923 4651 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.422932 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.422941 4651 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d312157-9607-429b-bcb0-c6bc126938a8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.933653 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" event={"ID":"8d312157-9607-429b-bcb0-c6bc126938a8","Type":"ContainerDied","Data":"4a019315beae501f321c4354412f3bc85a559207f76f815bd93fc8029c0be0f5"} Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.933701 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95bd95597-dqpwc" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.933712 4651 scope.go:117] "RemoveContainer" containerID="436eabcc04ab03bec5e3a58f278a272b9c4d282d2fdaf634f3a33425311680dd" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.961723 4651 scope.go:117] "RemoveContainer" containerID="61dfba3a45df4f5ed7c88d6c6185d7884ccf14eb957fcb14c61e493e1302fad2" Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.977634 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95bd95597-dqpwc"] Nov 26 15:12:12 crc kubenswrapper[4651]: I1126 15:12:12.987604 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95bd95597-dqpwc"] Nov 26 15:12:13 crc kubenswrapper[4651]: I1126 15:12:13.412529 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d312157-9607-429b-bcb0-c6bc126938a8" path="/var/lib/kubelet/pods/8d312157-9607-429b-bcb0-c6bc126938a8/volumes" Nov 26 15:12:20 crc kubenswrapper[4651]: E1126 15:12:20.567921 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etc-swift], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openstack/swift-proxy-6978d54687-jsqtl" podUID="09fca043-ad27-4285-8894-522bc6cc68f4" Nov 26 15:12:21 crc kubenswrapper[4651]: I1126 15:12:21.035599 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.073028 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cw96g"] Nov 26 15:12:22 crc kubenswrapper[4651]: E1126 15:12:22.073762 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d312157-9607-429b-bcb0-c6bc126938a8" containerName="dnsmasq-dns" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.073777 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d312157-9607-429b-bcb0-c6bc126938a8" containerName="dnsmasq-dns" Nov 26 15:12:22 crc kubenswrapper[4651]: E1126 15:12:22.073809 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d312157-9607-429b-bcb0-c6bc126938a8" containerName="init" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.073817 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d312157-9607-429b-bcb0-c6bc126938a8" containerName="init" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.074076 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d312157-9607-429b-bcb0-c6bc126938a8" containerName="dnsmasq-dns" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.075615 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.087663 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cw96g"] Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.211494 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvrvq\" (UniqueName: \"kubernetes.io/projected/5c231a23-6a60-4022-9e05-66aee576b01a-kube-api-access-gvrvq\") pod \"redhat-operators-cw96g\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.211845 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-catalog-content\") pod \"redhat-operators-cw96g\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.211975 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-utilities\") pod \"redhat-operators-cw96g\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.313884 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvrvq\" (UniqueName: \"kubernetes.io/projected/5c231a23-6a60-4022-9e05-66aee576b01a-kube-api-access-gvrvq\") pod \"redhat-operators-cw96g\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.314022 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-catalog-content\") pod \"redhat-operators-cw96g\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.314085 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-utilities\") pod \"redhat-operators-cw96g\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.314537 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-utilities\") pod \"redhat-operators-cw96g\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.314698 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-catalog-content\") pod \"redhat-operators-cw96g\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.339853 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvrvq\" (UniqueName: \"kubernetes.io/projected/5c231a23-6a60-4022-9e05-66aee576b01a-kube-api-access-gvrvq\") pod \"redhat-operators-cw96g\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.402633 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:22 crc kubenswrapper[4651]: I1126 15:12:22.854552 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cw96g"] Nov 26 15:12:23 crc kubenswrapper[4651]: I1126 15:12:23.052212 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw96g" event={"ID":"5c231a23-6a60-4022-9e05-66aee576b01a","Type":"ContainerStarted","Data":"c7ecfe5a2d40d1df35b36b2164ec827cbc4c64cbb3a0e7ed45c6e8f51ea73711"} Nov 26 15:12:23 crc kubenswrapper[4651]: I1126 15:12:23.640850 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:12:23 crc kubenswrapper[4651]: I1126 15:12:23.647943 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09fca043-ad27-4285-8894-522bc6cc68f4-etc-swift\") pod \"swift-proxy-6978d54687-jsqtl\" (UID: \"09fca043-ad27-4285-8894-522bc6cc68f4\") " pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:12:23 crc kubenswrapper[4651]: I1126 15:12:23.737508 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:12:24 crc kubenswrapper[4651]: I1126 15:12:24.063996 4651 generic.go:334] "Generic (PLEG): container finished" podID="5c231a23-6a60-4022-9e05-66aee576b01a" containerID="5c1160bddb2e68af62f7943a9ee90d09eaafcb50ea36a6bcba5b6844e8d27c70" exitCode=0 Nov 26 15:12:24 crc kubenswrapper[4651]: I1126 15:12:24.064101 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw96g" event={"ID":"5c231a23-6a60-4022-9e05-66aee576b01a","Type":"ContainerDied","Data":"5c1160bddb2e68af62f7943a9ee90d09eaafcb50ea36a6bcba5b6844e8d27c70"} Nov 26 15:12:24 crc kubenswrapper[4651]: I1126 15:12:24.270437 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6978d54687-jsqtl"] Nov 26 15:12:24 crc kubenswrapper[4651]: W1126 15:12:24.279872 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09fca043_ad27_4285_8894_522bc6cc68f4.slice/crio-e131553064172a377a93f3a364d99c755cf4350c9283a71815f6670cb693ebbc WatchSource:0}: Error finding container e131553064172a377a93f3a364d99c755cf4350c9283a71815f6670cb693ebbc: Status 404 returned error can't find the container with id e131553064172a377a93f3a364d99c755cf4350c9283a71815f6670cb693ebbc Nov 26 15:12:25 crc kubenswrapper[4651]: I1126 15:12:25.079919 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6978d54687-jsqtl" event={"ID":"09fca043-ad27-4285-8894-522bc6cc68f4","Type":"ContainerStarted","Data":"be6a40d1bae4eb85a76f8076a87f987f4d54a22663232205935748580f3f309f"} Nov 26 15:12:25 crc kubenswrapper[4651]: I1126 15:12:25.080411 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6978d54687-jsqtl" event={"ID":"09fca043-ad27-4285-8894-522bc6cc68f4","Type":"ContainerStarted","Data":"1cb1e202407aed90dc8a9f160ffdd6ca6df06c95361a4fb3ed72eb4dc54d8a8c"} Nov 26 15:12:25 crc 
kubenswrapper[4651]: I1126 15:12:25.080436 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6978d54687-jsqtl" event={"ID":"09fca043-ad27-4285-8894-522bc6cc68f4","Type":"ContainerStarted","Data":"e131553064172a377a93f3a364d99c755cf4350c9283a71815f6670cb693ebbc"} Nov 26 15:12:25 crc kubenswrapper[4651]: I1126 15:12:25.080599 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:12:25 crc kubenswrapper[4651]: I1126 15:12:25.081107 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:12:25 crc kubenswrapper[4651]: I1126 15:12:25.116469 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6978d54687-jsqtl" podStartSLOduration=252.116447014 podStartE2EDuration="4m12.116447014s" podCreationTimestamp="2025-11-26 15:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:12:25.105264193 +0000 UTC m=+1312.531011807" watchObservedRunningTime="2025-11-26 15:12:25.116447014 +0000 UTC m=+1312.542194618" Nov 26 15:12:26 crc kubenswrapper[4651]: I1126 15:12:26.089891 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw96g" event={"ID":"5c231a23-6a60-4022-9e05-66aee576b01a","Type":"ContainerStarted","Data":"04f06a03bca7ddabaec37db96bfbae8632a7e17ec860e3c4aa07c713e7b9c28e"} Nov 26 15:12:29 crc kubenswrapper[4651]: I1126 15:12:29.133262 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:12:29 crc kubenswrapper[4651]: I1126 15:12:29.133832 4651 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:12:29 crc kubenswrapper[4651]: I1126 15:12:29.133889 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 15:12:29 crc kubenswrapper[4651]: I1126 15:12:29.134778 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"743e37a0879fef7149021c1c72d47f0f5826caa510cee0fbc25f23140cbdb919"} pod="openshift-machine-config-operator/machine-config-daemon-99mrs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:12:29 crc kubenswrapper[4651]: I1126 15:12:29.134854 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" containerID="cri-o://743e37a0879fef7149021c1c72d47f0f5826caa510cee0fbc25f23140cbdb919" gracePeriod=600 Nov 26 15:12:30 crc kubenswrapper[4651]: I1126 15:12:30.129455 4651 generic.go:334] "Generic (PLEG): container finished" podID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerID="743e37a0879fef7149021c1c72d47f0f5826caa510cee0fbc25f23140cbdb919" exitCode=0 Nov 26 15:12:30 crc kubenswrapper[4651]: I1126 15:12:30.129536 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerDied","Data":"743e37a0879fef7149021c1c72d47f0f5826caa510cee0fbc25f23140cbdb919"} Nov 26 15:12:30 crc kubenswrapper[4651]: I1126 15:12:30.130148 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"} Nov 26 15:12:30 crc kubenswrapper[4651]: I1126 15:12:30.130228 4651 scope.go:117] "RemoveContainer" containerID="1b324081402e9e9abd725d1ece3f18cded052636ec277c013a1f5a3dea9b3cf7" Nov 26 15:12:31 crc kubenswrapper[4651]: I1126 15:12:31.140110 4651 generic.go:334] "Generic (PLEG): container finished" podID="5c231a23-6a60-4022-9e05-66aee576b01a" containerID="04f06a03bca7ddabaec37db96bfbae8632a7e17ec860e3c4aa07c713e7b9c28e" exitCode=0 Nov 26 15:12:31 crc kubenswrapper[4651]: I1126 15:12:31.140209 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw96g" event={"ID":"5c231a23-6a60-4022-9e05-66aee576b01a","Type":"ContainerDied","Data":"04f06a03bca7ddabaec37db96bfbae8632a7e17ec860e3c4aa07c713e7b9c28e"} Nov 26 15:12:32 crc kubenswrapper[4651]: I1126 15:12:32.153323 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw96g" event={"ID":"5c231a23-6a60-4022-9e05-66aee576b01a","Type":"ContainerStarted","Data":"b516d291a20013b8f1eee0a170ed9a7a9a403adee5be808c3cae2495abf3a320"} Nov 26 15:12:32 crc kubenswrapper[4651]: I1126 15:12:32.172026 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cw96g" podStartSLOduration=2.317649814 podStartE2EDuration="10.172006348s" podCreationTimestamp="2025-11-26 15:12:22 +0000 UTC" firstStartedPulling="2025-11-26 15:12:24.066383034 +0000 UTC m=+1311.492130638" lastFinishedPulling="2025-11-26 15:12:31.920739578 +0000 UTC m=+1319.346487172" observedRunningTime="2025-11-26 15:12:32.170204995 +0000 UTC m=+1319.595952619" watchObservedRunningTime="2025-11-26 15:12:32.172006348 +0000 UTC m=+1319.597753952" Nov 26 15:12:32 crc 
kubenswrapper[4651]: I1126 15:12:32.402831 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:32 crc kubenswrapper[4651]: I1126 15:12:32.402879 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:12:33 crc kubenswrapper[4651]: I1126 15:12:33.456782 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cw96g" podUID="5c231a23-6a60-4022-9e05-66aee576b01a" containerName="registry-server" probeResult="failure" output=< Nov 26 15:12:33 crc kubenswrapper[4651]: timeout: failed to connect service ":50051" within 1s Nov 26 15:12:33 crc kubenswrapper[4651]: > Nov 26 15:12:33 crc kubenswrapper[4651]: I1126 15:12:33.743053 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:12:33 crc kubenswrapper[4651]: I1126 15:12:33.743879 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6978d54687-jsqtl" Nov 26 15:12:37 crc kubenswrapper[4651]: E1126 15:12:37.167768 4651 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.169264 4651 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.170475 4651 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.170608 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.170849 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e" gracePeriod=15 Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.170898 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1" gracePeriod=15 Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.170972 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd" gracePeriod=15 Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.170981 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497" gracePeriod=15 Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.171152 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336" gracePeriod=15 Nov 26 15:12:37 crc 
kubenswrapper[4651]: I1126 15:12:37.172464 4651 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 15:12:37 crc kubenswrapper[4651]: E1126 15:12:37.172827 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.172842 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 15:12:37 crc kubenswrapper[4651]: E1126 15:12:37.172856 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.172863 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 15:12:37 crc kubenswrapper[4651]: E1126 15:12:37.172871 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.172877 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 15:12:37 crc kubenswrapper[4651]: E1126 15:12:37.172919 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.172926 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 15:12:37 crc kubenswrapper[4651]: E1126 15:12:37.172935 4651 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.172942 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 26 15:12:37 crc kubenswrapper[4651]: E1126 15:12:37.172954 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.172961 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 15:12:37 crc kubenswrapper[4651]: E1126 15:12:37.172983 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.172989 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.173228 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.173247 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.173259 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.173270 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.173293 4651 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.173307 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 15:12:37 crc kubenswrapper[4651]: E1126 15:12:37.215658 4651 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.241:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.254157 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.254220 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.254307 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.254406 4651 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.254451 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.254506 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.254533 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.254704 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 
15:12:37.355942 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356057 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356081 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356122 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356141 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356157 4651 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356177 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356194 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356279 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356313 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356335 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356355 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356374 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356392 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356412 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.356431 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: I1126 15:12:37.516675 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:37 crc kubenswrapper[4651]: E1126 15:12:37.556064 4651 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.241:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b973abb835ef8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 15:12:37.555527416 +0000 UTC m=+1324.981275040,LastTimestamp:2025-11-26 15:12:37.555527416 +0000 UTC m=+1324.981275040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.236120 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.242916 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.243986 4651 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336" exitCode=0 Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.244027 4651 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd" exitCode=0 Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.244056 4651 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1" exitCode=0 Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.244066 4651 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497" exitCode=2 Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.244064 4651 scope.go:117] "RemoveContainer" containerID="fb93b59a5a145c7430f6a0d2d20a52b82640be14fdbe0b3a09982193b8c6f23a" Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.248272 4651 generic.go:334] "Generic (PLEG): container finished" podID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" containerID="7044ba9af971cb3dc8203577b229bf9b85775ccd0f5018e496100fbde1161145" exitCode=0 Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.248370 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2","Type":"ContainerDied","Data":"7044ba9af971cb3dc8203577b229bf9b85775ccd0f5018e496100fbde1161145"} Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.249592 4651 status_manager.go:851] "Failed to get 
status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.250084 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9"} Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.250139 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"be25a2e0ada10377018a1bb5384f52f186433fa47df6a285b9a6a0e6bbd0cbca"} Nov 26 15:12:38 crc kubenswrapper[4651]: I1126 15:12:38.250586 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:38 crc kubenswrapper[4651]: E1126 15:12:38.250700 4651 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.241:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:12:38 crc kubenswrapper[4651]: E1126 15:12:38.439083 4651 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openstack/mysql-db-openstack-cell1-galera-0: failed to fetch PVC from API server: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack/persistentvolumeclaims/mysql-db-openstack-cell1-galera-0\": dial tcp 38.102.83.241:6443: connect: connection refused" pod="openstack/openstack-cell1-galera-0" volumeName="mysql-db" Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.324930 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.865295 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.866420 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.874854 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.875784 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.876607 4651 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.876968 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.906143 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-var-lock\") pod \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\" (UID: \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.906205 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kubelet-dir\") pod \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\" (UID: \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.906255 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kube-api-access\") pod \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\" (UID: \"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2\") " Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.906227 4651 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-var-lock" (OuterVolumeSpecName: "var-lock") pod "e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" (UID: "e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.906354 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" (UID: "e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.906733 4651 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:39 crc kubenswrapper[4651]: I1126 15:12:39.911754 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" (UID: "e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.007977 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.008057 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.008101 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.008180 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.008292 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.008399 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.008680 4651 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-var-lock\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.008729 4651 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.008738 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.008748 4651 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.008755 4651 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.342472 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.345338 4651 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e" exitCode=0 Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.345442 4651 scope.go:117] "RemoveContainer" containerID="7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.345410 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.349182 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2","Type":"ContainerDied","Data":"f72984af8379e1082d4e0cfb9b53bb17b67059ac63878d415d5692936b3eb148"} Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.349222 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f72984af8379e1082d4e0cfb9b53bb17b67059ac63878d415d5692936b3eb148" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.349237 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.364661 4651 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.365257 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.376900 4651 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.377285 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.384310 4651 scope.go:117] "RemoveContainer" containerID="1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.408009 4651 scope.go:117] "RemoveContainer" containerID="663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1" Nov 26 15:12:40 crc 
kubenswrapper[4651]: E1126 15:12:40.421519 4651 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.241:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b973abb835ef8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 15:12:37.555527416 +0000 UTC m=+1324.981275040,LastTimestamp:2025-11-26 15:12:37.555527416 +0000 UTC m=+1324.981275040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.441211 4651 scope.go:117] "RemoveContainer" containerID="9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.478631 4651 scope.go:117] "RemoveContainer" containerID="0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.506611 4651 scope.go:117] "RemoveContainer" containerID="1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.543451 4651 scope.go:117] "RemoveContainer" containerID="7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336" Nov 26 15:12:40 crc kubenswrapper[4651]: E1126 15:12:40.544750 4651 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336\": container with ID starting with 7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336 not found: ID does not exist" containerID="7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.544800 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336"} err="failed to get container status \"7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336\": rpc error: code = NotFound desc = could not find container \"7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336\": container with ID starting with 7f449d92ae66ef7b09a1c5790514f9df6d5c23275cf96789602c0d52aea6c336 not found: ID does not exist" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.545044 4651 scope.go:117] "RemoveContainer" containerID="1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd" Nov 26 15:12:40 crc kubenswrapper[4651]: E1126 15:12:40.545544 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd\": container with ID starting with 1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd not found: ID does not exist" containerID="1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.545584 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd"} err="failed to get container status \"1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd\": rpc error: code = NotFound desc = could 
not find container \"1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd\": container with ID starting with 1ce6f42457a0e670dc25d3f3f4479cd13b4a9eddafcfe533a0a9747a747073cd not found: ID does not exist" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.545641 4651 scope.go:117] "RemoveContainer" containerID="663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1" Nov 26 15:12:40 crc kubenswrapper[4651]: E1126 15:12:40.546410 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1\": container with ID starting with 663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1 not found: ID does not exist" containerID="663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.546439 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1"} err="failed to get container status \"663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1\": rpc error: code = NotFound desc = could not find container \"663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1\": container with ID starting with 663bf6e333b24ed149673de5ad1f6fc8a6183083cbfd82ab1e9f181412e8a1b1 not found: ID does not exist" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.546460 4651 scope.go:117] "RemoveContainer" containerID="9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497" Nov 26 15:12:40 crc kubenswrapper[4651]: E1126 15:12:40.546755 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497\": container with ID starting with 9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497 not found: 
ID does not exist" containerID="9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.546785 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497"} err="failed to get container status \"9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497\": rpc error: code = NotFound desc = could not find container \"9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497\": container with ID starting with 9987b20743165d8d926d0aa6a8087677ec8d2e809868e290b10c4a2ff9814497 not found: ID does not exist" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.546864 4651 scope.go:117] "RemoveContainer" containerID="0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e" Nov 26 15:12:40 crc kubenswrapper[4651]: E1126 15:12:40.547345 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e\": container with ID starting with 0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e not found: ID does not exist" containerID="0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.547406 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e"} err="failed to get container status \"0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e\": rpc error: code = NotFound desc = could not find container \"0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e\": container with ID starting with 0236d8a97d40d92c9daa60c5f6054546b1e9e3df90d073f446cc28b6358aa03e not found: ID does not exist" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.547440 4651 
scope.go:117] "RemoveContainer" containerID="1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d" Nov 26 15:12:40 crc kubenswrapper[4651]: E1126 15:12:40.548077 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d\": container with ID starting with 1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d not found: ID does not exist" containerID="1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d" Nov 26 15:12:40 crc kubenswrapper[4651]: I1126 15:12:40.548099 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d"} err="failed to get container status \"1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d\": rpc error: code = NotFound desc = could not find container \"1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d\": container with ID starting with 1201c6438e3e7996f9b3009c3ce58e00c58662031fb899a1e813d3217df8015d not found: ID does not exist" Nov 26 15:12:41 crc kubenswrapper[4651]: I1126 15:12:41.421655 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 26 15:12:43 crc kubenswrapper[4651]: I1126 15:12:43.410542 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:43 crc kubenswrapper[4651]: I1126 15:12:43.461351 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cw96g" 
podUID="5c231a23-6a60-4022-9e05-66aee576b01a" containerName="registry-server" probeResult="failure" output=< Nov 26 15:12:43 crc kubenswrapper[4651]: timeout: failed to connect service ":50051" within 1s Nov 26 15:12:43 crc kubenswrapper[4651]: > Nov 26 15:12:44 crc kubenswrapper[4651]: E1126 15:12:44.921083 4651 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:44 crc kubenswrapper[4651]: E1126 15:12:44.921722 4651 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:44 crc kubenswrapper[4651]: E1126 15:12:44.921932 4651 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:44 crc kubenswrapper[4651]: E1126 15:12:44.922119 4651 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:44 crc kubenswrapper[4651]: E1126 15:12:44.922297 4651 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:44 crc kubenswrapper[4651]: I1126 15:12:44.922319 4651 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 26 15:12:44 crc kubenswrapper[4651]: E1126 15:12:44.922697 4651 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="200ms" Nov 26 15:12:45 crc kubenswrapper[4651]: E1126 15:12:45.124106 4651 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="400ms" Nov 26 15:12:45 crc kubenswrapper[4651]: I1126 15:12:45.278816 4651 scope.go:117] "RemoveContainer" containerID="872f6cedb7fd99f3da10b91a88fe1cba056c3c05144a395a57e9aa6d6db1ee9c" Nov 26 15:12:45 crc kubenswrapper[4651]: I1126 15:12:45.300718 4651 scope.go:117] "RemoveContainer" containerID="e2432d3700fe0e8c82c27265682f774fc492ca087f3b84a2b8d108a5881130f4" Nov 26 15:12:45 crc kubenswrapper[4651]: I1126 15:12:45.365661 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="8c2d03fc-6edd-4654-8116-99aae88e3fab" containerName="kube-state-metrics" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 26 15:12:45 crc kubenswrapper[4651]: E1126 15:12:45.525360 4651 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="800ms" Nov 26 15:12:46 crc kubenswrapper[4651]: E1126 15:12:46.326782 4651 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="1.6s" Nov 26 15:12:47 crc kubenswrapper[4651]: I1126 15:12:47.439085 4651 
generic.go:334] "Generic (PLEG): container finished" podID="b24122be-246e-4dc9-a3ad-4ca2392a4660" containerID="d68457aa0182b7b472e788b09f60b13e7f5345e7a22b5bc1daa6ec33f2b70b24" exitCode=1 Nov 26 15:12:47 crc kubenswrapper[4651]: I1126 15:12:47.439170 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" event={"ID":"b24122be-246e-4dc9-a3ad-4ca2392a4660","Type":"ContainerDied","Data":"d68457aa0182b7b472e788b09f60b13e7f5345e7a22b5bc1daa6ec33f2b70b24"} Nov 26 15:12:47 crc kubenswrapper[4651]: I1126 15:12:47.439804 4651 scope.go:117] "RemoveContainer" containerID="d68457aa0182b7b472e788b09f60b13e7f5345e7a22b5bc1daa6ec33f2b70b24" Nov 26 15:12:47 crc kubenswrapper[4651]: I1126 15:12:47.440497 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:47 crc kubenswrapper[4651]: I1126 15:12:47.440918 4651 status_manager.go:851] "Failed to get status for pod" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:47 crc kubenswrapper[4651]: E1126 15:12:47.711087 4651 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:12:47Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:12:47Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:12:47Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:12:47Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:47 crc kubenswrapper[4651]: E1126 15:12:47.712062 4651 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:47 crc kubenswrapper[4651]: E1126 15:12:47.712476 4651 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:47 crc kubenswrapper[4651]: E1126 15:12:47.712802 4651 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 
15:12:47 crc kubenswrapper[4651]: E1126 15:12:47.713340 4651 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:47 crc kubenswrapper[4651]: E1126 15:12:47.713361 4651 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 15:12:47 crc kubenswrapper[4651]: E1126 15:12:47.929852 4651 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="3.2s" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.180507 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.180869 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.450837 4651 generic.go:334] "Generic (PLEG): container finished" podID="b24122be-246e-4dc9-a3ad-4ca2392a4660" containerID="c4e50192edbd84cd318dfe4063725741980c611b90c0c32692933c83d696a83e" exitCode=1 Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.450973 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" event={"ID":"b24122be-246e-4dc9-a3ad-4ca2392a4660","Type":"ContainerDied","Data":"c4e50192edbd84cd318dfe4063725741980c611b90c0c32692933c83d696a83e"} Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.451021 4651 scope.go:117] "RemoveContainer" 
containerID="d68457aa0182b7b472e788b09f60b13e7f5345e7a22b5bc1daa6ec33f2b70b24" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.451770 4651 scope.go:117] "RemoveContainer" containerID="c4e50192edbd84cd318dfe4063725741980c611b90c0c32692933c83d696a83e" Nov 26 15:12:48 crc kubenswrapper[4651]: E1126 15:12:48.452136 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=octavia-operator-controller-manager-64cdc6ff96-x9mdd_openstack-operators(b24122be-246e-4dc9-a3ad-4ca2392a4660)\"" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.452239 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.452507 4651 status_manager.go:851] "Failed to get status for pod" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.459159 4651 generic.go:334] "Generic (PLEG): container finished" podID="f688796e-89d5-4da8-8dc7-786c5940b853" containerID="836f5fbd351ff1aac2a02ee27c4f73eeccc67aacdd1c8c67b75bb0115f646551" exitCode=1 Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.459294 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" event={"ID":"f688796e-89d5-4da8-8dc7-786c5940b853","Type":"ContainerDied","Data":"836f5fbd351ff1aac2a02ee27c4f73eeccc67aacdd1c8c67b75bb0115f646551"} Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.460252 4651 scope.go:117] "RemoveContainer" containerID="836f5fbd351ff1aac2a02ee27c4f73eeccc67aacdd1c8c67b75bb0115f646551" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.460389 4651 status_manager.go:851] "Failed to get status for pod" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5b5d786cf6-wsrgh\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.461231 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.461543 4651 status_manager.go:851] "Failed to get status for pod" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.462288 4651 generic.go:334] "Generic (PLEG): container finished" podID="14110a58-3dd5-4827-8a86-d4c0fc377b97" containerID="3dfdb12a088d907a58eb0d8f09b3f0260add56c0bdb3a616b591972ca4cc60e3" exitCode=1 Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.462325 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" event={"ID":"14110a58-3dd5-4827-8a86-d4c0fc377b97","Type":"ContainerDied","Data":"3dfdb12a088d907a58eb0d8f09b3f0260add56c0bdb3a616b591972ca4cc60e3"} Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.462794 4651 scope.go:117] "RemoveContainer" containerID="3dfdb12a088d907a58eb0d8f09b3f0260add56c0bdb3a616b591972ca4cc60e3" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.463242 4651 status_manager.go:851] "Failed to get status for pod" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5b5d786cf6-wsrgh\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.467283 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.468008 4651 status_manager.go:851] "Failed to get status for pod" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/ironic-operator-controller-manager-67cb4dc6d4-cggjs\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:48 crc kubenswrapper[4651]: I1126 15:12:48.468391 4651 status_manager.go:851] "Failed to get status for pod" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.473641 4651 scope.go:117] "RemoveContainer" containerID="c4e50192edbd84cd318dfe4063725741980c611b90c0c32692933c83d696a83e" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.474167 4651 status_manager.go:851] "Failed to get status for pod" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5b5d786cf6-wsrgh\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: E1126 15:12:49.474690 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=octavia-operator-controller-manager-64cdc6ff96-x9mdd_openstack-operators(b24122be-246e-4dc9-a3ad-4ca2392a4660)\"" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.474683 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.474960 4651 status_manager.go:851] "Failed to get status for pod" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/ironic-operator-controller-manager-67cb4dc6d4-cggjs\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.475226 4651 status_manager.go:851] "Failed to get status for pod" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.475680 4651 generic.go:334] "Generic (PLEG): container finished" podID="f688796e-89d5-4da8-8dc7-786c5940b853" containerID="2e1a7fb8fe4a747df0ec560d0f9d356b553d8e69b84e844d7ff5ed2b0bd0c6ea" exitCode=1 Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.475763 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" event={"ID":"f688796e-89d5-4da8-8dc7-786c5940b853","Type":"ContainerDied","Data":"2e1a7fb8fe4a747df0ec560d0f9d356b553d8e69b84e844d7ff5ed2b0bd0c6ea"} Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.476554 4651 scope.go:117] "RemoveContainer" containerID="836f5fbd351ff1aac2a02ee27c4f73eeccc67aacdd1c8c67b75bb0115f646551" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.477141 4651 scope.go:117] "RemoveContainer" containerID="2e1a7fb8fe4a747df0ec560d0f9d356b553d8e69b84e844d7ff5ed2b0bd0c6ea" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.477251 4651 status_manager.go:851] "Failed to get status for pod" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5b5d786cf6-wsrgh\": dial tcp 
38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: E1126 15:12:49.477385 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=metallb-operator-controller-manager-5b5d786cf6-wsrgh_metallb-system(f688796e-89d5-4da8-8dc7-786c5940b853)\"" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.477478 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.477718 4651 status_manager.go:851] "Failed to get status for pod" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/ironic-operator-controller-manager-67cb4dc6d4-cggjs\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.477985 4651 status_manager.go:851] "Failed to get status for pod" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.479371 4651 generic.go:334] "Generic (PLEG): container finished" podID="14110a58-3dd5-4827-8a86-d4c0fc377b97" 
containerID="8094d95eeaf473de01c22d35983ec9c10b42c0c887ef3a53c8e159a476b692a1" exitCode=1 Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.479399 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" event={"ID":"14110a58-3dd5-4827-8a86-d4c0fc377b97","Type":"ContainerDied","Data":"8094d95eeaf473de01c22d35983ec9c10b42c0c887ef3a53c8e159a476b692a1"} Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.479888 4651 scope.go:117] "RemoveContainer" containerID="8094d95eeaf473de01c22d35983ec9c10b42c0c887ef3a53c8e159a476b692a1" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.480120 4651 status_manager.go:851] "Failed to get status for pod" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/ironic-operator-controller-manager-67cb4dc6d4-cggjs\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: E1126 15:12:49.480179 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ironic-operator-controller-manager-67cb4dc6d4-cggjs_openstack-operators(14110a58-3dd5-4827-8a86-d4c0fc377b97)\"" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.480630 4651 status_manager.go:851] "Failed to get status for pod" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 
15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.480955 4651 status_manager.go:851] "Failed to get status for pod" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5b5d786cf6-wsrgh\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.481394 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:49 crc kubenswrapper[4651]: I1126 15:12:49.534663 4651 scope.go:117] "RemoveContainer" containerID="3dfdb12a088d907a58eb0d8f09b3f0260add56c0bdb3a616b591972ca4cc60e3" Nov 26 15:12:50 crc kubenswrapper[4651]: E1126 15:12:50.422639 4651 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.241:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b973abb835ef8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 15:12:37.555527416 +0000 UTC 
m=+1324.981275040,LastTimestamp:2025-11-26 15:12:37.555527416 +0000 UTC m=+1324.981275040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 15:12:51 crc kubenswrapper[4651]: E1126 15:12:51.131857 4651 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="6.4s" Nov 26 15:12:51 crc kubenswrapper[4651]: I1126 15:12:51.503011 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 26 15:12:51 crc kubenswrapper[4651]: I1126 15:12:51.503107 4651 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5ac3adc10715786992515543ed414422c509b2deefee47097229ed25286f3db6" exitCode=1 Nov 26 15:12:51 crc kubenswrapper[4651]: I1126 15:12:51.503141 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5ac3adc10715786992515543ed414422c509b2deefee47097229ed25286f3db6"} Nov 26 15:12:51 crc kubenswrapper[4651]: I1126 15:12:51.503867 4651 scope.go:117] "RemoveContainer" containerID="5ac3adc10715786992515543ed414422c509b2deefee47097229ed25286f3db6" Nov 26 15:12:51 crc kubenswrapper[4651]: I1126 15:12:51.504181 4651 status_manager.go:851] "Failed to get status for pod" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5b5d786cf6-wsrgh\": dial tcp 38.102.83.241:6443: connect: 
connection refused" Nov 26 15:12:51 crc kubenswrapper[4651]: I1126 15:12:51.504525 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:51 crc kubenswrapper[4651]: I1126 15:12:51.504865 4651 status_manager.go:851] "Failed to get status for pod" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/ironic-operator-controller-manager-67cb4dc6d4-cggjs\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:51 crc kubenswrapper[4651]: I1126 15:12:51.505425 4651 status_manager.go:851] "Failed to get status for pod" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:51 crc kubenswrapper[4651]: I1126 15:12:51.505747 4651 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.401459 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.403307 4651 status_manager.go:851] "Failed to get status for pod" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5b5d786cf6-wsrgh\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.403872 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.405164 4651 status_manager.go:851] "Failed to get status for pod" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/ironic-operator-controller-manager-67cb4dc6d4-cggjs\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.405557 4651 status_manager.go:851] "Failed to get status for pod" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.406419 4651 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.431991 4651 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="12b5ceb0-b9c6-412e-ab66-35eb5612345d" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.432214 4651 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="12b5ceb0-b9c6-412e-ab66-35eb5612345d" Nov 26 15:12:52 crc kubenswrapper[4651]: E1126 15:12:52.433255 4651 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.435203 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:52 crc kubenswrapper[4651]: W1126 15:12:52.479398 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-eef4a53d74c417e8b68ef78467aee4213518bab1e56c185559f77d56ee3fef91 WatchSource:0}: Error finding container eef4a53d74c417e8b68ef78467aee4213518bab1e56c185559f77d56ee3fef91: Status 404 returned error can't find the container with id eef4a53d74c417e8b68ef78467aee4213518bab1e56c185559f77d56ee3fef91 Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.525421 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"eef4a53d74c417e8b68ef78467aee4213518bab1e56c185559f77d56ee3fef91"} Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.530064 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.530161 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a5e96db1f5e4d7077c95a36c10ab535ea388be9fb642fadc1c740791637ffef8"} Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.532963 4651 status_manager.go:851] "Failed to get status for pod" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5b5d786cf6-wsrgh\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 
15:12:52.533769 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.534373 4651 status_manager.go:851] "Failed to get status for pod" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/ironic-operator-controller-manager-67cb4dc6d4-cggjs\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.535268 4651 status_manager.go:851] "Failed to get status for pod" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:52 crc kubenswrapper[4651]: I1126 15:12:52.535587 4651 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.415006 4651 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.415951 4651 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.416624 4651 status_manager.go:851] "Failed to get status for pod" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5b5d786cf6-wsrgh\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.417117 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.417848 4651 status_manager.go:851] "Failed to get status for pod" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/ironic-operator-controller-manager-67cb4dc6d4-cggjs\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.418434 4651 status_manager.go:851] "Failed to get status for pod" 
podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.486555 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cw96g" podUID="5c231a23-6a60-4022-9e05-66aee576b01a" containerName="registry-server" probeResult="failure" output=< Nov 26 15:12:53 crc kubenswrapper[4651]: timeout: failed to connect service ":50051" within 1s Nov 26 15:12:53 crc kubenswrapper[4651]: > Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.540651 4651 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="e149fb67f2b91db9d2397e2d733015cb91c80a48fa5f7637294f28f6dd2c521e" exitCode=0 Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.540700 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"e149fb67f2b91db9d2397e2d733015cb91c80a48fa5f7637294f28f6dd2c521e"} Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.540981 4651 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="12b5ceb0-b9c6-412e-ab66-35eb5612345d" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.541015 4651 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="12b5ceb0-b9c6-412e-ab66-35eb5612345d" Nov 26 15:12:53 crc kubenswrapper[4651]: E1126 15:12:53.541479 4651 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: 
connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.541487 4651 status_manager.go:851] "Failed to get status for pod" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/octavia-operator-controller-manager-64cdc6ff96-x9mdd\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.542678 4651 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.543002 4651 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.543349 4651 status_manager.go:851] "Failed to get status for pod" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5b5d786cf6-wsrgh\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.543611 4651 status_manager.go:851] "Failed to get status for pod" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:53 crc kubenswrapper[4651]: I1126 15:12:53.544064 4651 status_manager.go:851] "Failed to get status for pod" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/ironic-operator-controller-manager-67cb4dc6d4-cggjs\": dial tcp 38.102.83.241:6443: connect: connection refused" Nov 26 15:12:54 crc kubenswrapper[4651]: I1126 15:12:54.560493 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dd50938bc6628fdf4a156ed48ee62cfaa0fdd4b07512c15f2c87e05f5eeb49e7"} Nov 26 15:12:54 crc kubenswrapper[4651]: I1126 15:12:54.561120 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"59daa8457e6687d27da4246eff977643c2b65b13396680670ac009c5d3a60180"} Nov 26 15:12:54 crc kubenswrapper[4651]: I1126 15:12:54.561135 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cd542adf9c74b4242af6410b93b81db40a61e4a6bcc2cb1046f428936f6ea6df"} Nov 26 15:12:54 crc kubenswrapper[4651]: I1126 15:12:54.888135 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:12:54 crc kubenswrapper[4651]: I1126 15:12:54.894679 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:12:55 crc 
kubenswrapper[4651]: I1126 15:12:55.384455 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="8c2d03fc-6edd-4654-8116-99aae88e3fab" containerName="kube-state-metrics" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 26 15:12:55 crc kubenswrapper[4651]: I1126 15:12:55.574067 4651 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="12b5ceb0-b9c6-412e-ab66-35eb5612345d" Nov 26 15:12:55 crc kubenswrapper[4651]: I1126 15:12:55.574357 4651 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="12b5ceb0-b9c6-412e-ab66-35eb5612345d" Nov 26 15:12:55 crc kubenswrapper[4651]: I1126 15:12:55.574253 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2bef9772fd0e6a7b0f1c0e5c1d63364abb2eb5f618e3eb590292aef9abebf756"} Nov 26 15:12:55 crc kubenswrapper[4651]: I1126 15:12:55.574452 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:12:55 crc kubenswrapper[4651]: I1126 15:12:55.574470 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:55 crc kubenswrapper[4651]: I1126 15:12:55.574481 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"500ef2ae485e1fb69a77848b1674181d99c6cc0b0bc3e22082aafbc284526738"} Nov 26 15:12:56 crc kubenswrapper[4651]: I1126 15:12:56.382134 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" Nov 26 15:12:56 crc kubenswrapper[4651]: I1126 15:12:56.382997 4651 
scope.go:117] "RemoveContainer" containerID="2e1a7fb8fe4a747df0ec560d0f9d356b553d8e69b84e844d7ff5ed2b0bd0c6ea" Nov 26 15:12:56 crc kubenswrapper[4651]: E1126 15:12:56.383381 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=metallb-operator-controller-manager-5b5d786cf6-wsrgh_metallb-system(f688796e-89d5-4da8-8dc7-786c5940b853)\"" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" Nov 26 15:12:57 crc kubenswrapper[4651]: I1126 15:12:57.435404 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:57 crc kubenswrapper[4651]: I1126 15:12:57.435745 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:57 crc kubenswrapper[4651]: I1126 15:12:57.440609 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:12:57 crc kubenswrapper[4651]: I1126 15:12:57.484733 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" Nov 26 15:12:57 crc kubenswrapper[4651]: I1126 15:12:57.485428 4651 scope.go:117] "RemoveContainer" containerID="8094d95eeaf473de01c22d35983ec9c10b42c0c887ef3a53c8e159a476b692a1" Nov 26 15:12:57 crc kubenswrapper[4651]: E1126 15:12:57.485648 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ironic-operator-controller-manager-67cb4dc6d4-cggjs_openstack-operators(14110a58-3dd5-4827-8a86-d4c0fc377b97)\"" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" 
podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" Nov 26 15:12:57 crc kubenswrapper[4651]: I1126 15:12:57.600993 4651 generic.go:334] "Generic (PLEG): container finished" podID="e50a607f-7a61-4a78-870a-297fa0daa977" containerID="c45d8db1ee32fbee85782fa6278a140842f2f14aee9b4287289b6612b0792535" exitCode=1 Nov 26 15:12:57 crc kubenswrapper[4651]: I1126 15:12:57.601051 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" event={"ID":"e50a607f-7a61-4a78-870a-297fa0daa977","Type":"ContainerDied","Data":"c45d8db1ee32fbee85782fa6278a140842f2f14aee9b4287289b6612b0792535"} Nov 26 15:12:57 crc kubenswrapper[4651]: I1126 15:12:57.602198 4651 scope.go:117] "RemoveContainer" containerID="c45d8db1ee32fbee85782fa6278a140842f2f14aee9b4287289b6612b0792535" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.179998 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.181371 4651 scope.go:117] "RemoveContainer" containerID="c4e50192edbd84cd318dfe4063725741980c611b90c0c32692933c83d696a83e" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.181490 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" podUID="719afb5d-40c4-4fa3-b030-38c170fc7dbb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.90:8081/readyz\": dial tcp 10.217.0.90:8081: connect: connection refused" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.351013 4651 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" podUID="8cd427a2-9759-460e-b86e-23e08dd7ba78" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.91:8081/readyz\": dial tcp 10.217.0.91:8081: connect: connection 
refused" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.616012 4651 generic.go:334] "Generic (PLEG): container finished" podID="719afb5d-40c4-4fa3-b030-38c170fc7dbb" containerID="4aec82603b80b561de33cfb0129cc4a117978fa847ac8eb77ffb74b4e2c43db9" exitCode=1 Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.616403 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" event={"ID":"719afb5d-40c4-4fa3-b030-38c170fc7dbb","Type":"ContainerDied","Data":"4aec82603b80b561de33cfb0129cc4a117978fa847ac8eb77ffb74b4e2c43db9"} Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.617105 4651 scope.go:117] "RemoveContainer" containerID="4aec82603b80b561de33cfb0129cc4a117978fa847ac8eb77ffb74b4e2c43db9" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.624276 4651 generic.go:334] "Generic (PLEG): container finished" podID="8271ec0d-f8ea-4c46-984f-95572691a379" containerID="ad887845c585d6e5bcef2dd124a04b5b952a0903b3ca9b4454446fdca2ea51c1" exitCode=1 Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.624356 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" event={"ID":"8271ec0d-f8ea-4c46-984f-95572691a379","Type":"ContainerDied","Data":"ad887845c585d6e5bcef2dd124a04b5b952a0903b3ca9b4454446fdca2ea51c1"} Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.624910 4651 scope.go:117] "RemoveContainer" containerID="ad887845c585d6e5bcef2dd124a04b5b952a0903b3ca9b4454446fdca2ea51c1" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.631632 4651 generic.go:334] "Generic (PLEG): container finished" podID="8cd427a2-9759-460e-b86e-23e08dd7ba78" containerID="0b711569ef0ffd586847bc5b97844ac34fdfc34c85d0e549c6958268777db4cd" exitCode=1 Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.631746 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" event={"ID":"8cd427a2-9759-460e-b86e-23e08dd7ba78","Type":"ContainerDied","Data":"0b711569ef0ffd586847bc5b97844ac34fdfc34c85d0e549c6958268777db4cd"} Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.632715 4651 scope.go:117] "RemoveContainer" containerID="0b711569ef0ffd586847bc5b97844ac34fdfc34c85d0e549c6958268777db4cd" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.638823 4651 generic.go:334] "Generic (PLEG): container finished" podID="e50a607f-7a61-4a78-870a-297fa0daa977" containerID="3f4740a4ce4122a4d499010e74fdc53d29f2871babb37d597d7d0ee9f15a0c73" exitCode=1 Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.638887 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" event={"ID":"e50a607f-7a61-4a78-870a-297fa0daa977","Type":"ContainerDied","Data":"3f4740a4ce4122a4d499010e74fdc53d29f2871babb37d597d7d0ee9f15a0c73"} Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.638927 4651 scope.go:117] "RemoveContainer" containerID="c45d8db1ee32fbee85782fa6278a140842f2f14aee9b4287289b6612b0792535" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.640822 4651 scope.go:117] "RemoveContainer" containerID="3f4740a4ce4122a4d499010e74fdc53d29f2871babb37d597d7d0ee9f15a0c73" Nov 26 15:12:58 crc kubenswrapper[4651]: E1126 15:12:58.641079 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=openstack-operator-controller-manager-5bcdd9fbc-vsb4g_openstack-operators(e50a607f-7a61-4a78-870a-297fa0daa977)\"" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" podUID="e50a607f-7a61-4a78-870a-297fa0daa977" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.646462 4651 generic.go:334] "Generic (PLEG): container finished" 
podID="b24122be-246e-4dc9-a3ad-4ca2392a4660" containerID="525c1ddb2a0b10ca086856e3c06fcf68d98b0ea9974fdb788725f75bb6ab1d01" exitCode=1 Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.646540 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" event={"ID":"b24122be-246e-4dc9-a3ad-4ca2392a4660","Type":"ContainerDied","Data":"525c1ddb2a0b10ca086856e3c06fcf68d98b0ea9974fdb788725f75bb6ab1d01"} Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.647306 4651 scope.go:117] "RemoveContainer" containerID="525c1ddb2a0b10ca086856e3c06fcf68d98b0ea9974fdb788725f75bb6ab1d01" Nov 26 15:12:58 crc kubenswrapper[4651]: E1126 15:12:58.647537 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=octavia-operator-controller-manager-64cdc6ff96-x9mdd_openstack-operators(b24122be-246e-4dc9-a3ad-4ca2392a4660)\"" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.649747 4651 generic.go:334] "Generic (PLEG): container finished" podID="eed373f0-add9-4ae8-b5cc-ed711e79b5c5" containerID="a71712d16bc040e56e42409d28dfe7133b928d732e60f0bf7fd5948d3351fabe" exitCode=1 Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.650580 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" event={"ID":"eed373f0-add9-4ae8-b5cc-ed711e79b5c5","Type":"ContainerDied","Data":"a71712d16bc040e56e42409d28dfe7133b928d732e60f0bf7fd5948d3351fabe"} Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.651387 4651 scope.go:117] "RemoveContainer" containerID="a71712d16bc040e56e42409d28dfe7133b928d732e60f0bf7fd5948d3351fabe" Nov 26 15:12:58 crc kubenswrapper[4651]: I1126 15:12:58.861182 4651 
scope.go:117] "RemoveContainer" containerID="c4e50192edbd84cd318dfe4063725741980c611b90c0c32692933c83d696a83e" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.661146 4651 generic.go:334] "Generic (PLEG): container finished" podID="ec10af15-dcf5-413d-87ef-0ca5a469b5fa" containerID="66d4e1a8a7464741d6291f866867bf3b5f18546445bfdc1338aff5d4014af56a" exitCode=1 Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.661218 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" event={"ID":"ec10af15-dcf5-413d-87ef-0ca5a469b5fa","Type":"ContainerDied","Data":"66d4e1a8a7464741d6291f866867bf3b5f18546445bfdc1338aff5d4014af56a"} Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.662700 4651 scope.go:117] "RemoveContainer" containerID="66d4e1a8a7464741d6291f866867bf3b5f18546445bfdc1338aff5d4014af56a" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.664204 4651 generic.go:334] "Generic (PLEG): container finished" podID="66532d04-3411-4813-ae53-4d635ee98911" containerID="254bb7226198214e8b2d36818fe4295bed20fbe71567f6521ae1043594715684" exitCode=1 Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.664264 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" event={"ID":"66532d04-3411-4813-ae53-4d635ee98911","Type":"ContainerDied","Data":"254bb7226198214e8b2d36818fe4295bed20fbe71567f6521ae1043594715684"} Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.664965 4651 scope.go:117] "RemoveContainer" containerID="254bb7226198214e8b2d36818fe4295bed20fbe71567f6521ae1043594715684" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.669885 4651 generic.go:334] "Generic (PLEG): container finished" podID="a8e49781-2e0b-476d-be9f-e17f05639447" containerID="5f4df4abc8cfc206de00d39d6be18c73e05ea9a7a2a6343dce8f6c26d5729dbf" exitCode=1 Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.669940 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" event={"ID":"a8e49781-2e0b-476d-be9f-e17f05639447","Type":"ContainerDied","Data":"5f4df4abc8cfc206de00d39d6be18c73e05ea9a7a2a6343dce8f6c26d5729dbf"} Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.670421 4651 scope.go:117] "RemoveContainer" containerID="5f4df4abc8cfc206de00d39d6be18c73e05ea9a7a2a6343dce8f6c26d5729dbf" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.672308 4651 generic.go:334] "Generic (PLEG): container finished" podID="8271ec0d-f8ea-4c46-984f-95572691a379" containerID="ae67209f929776e077fd695c2b58f547efadc1914e7822bd684b5c862e5df403" exitCode=1 Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.672332 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" event={"ID":"8271ec0d-f8ea-4c46-984f-95572691a379","Type":"ContainerDied","Data":"ae67209f929776e077fd695c2b58f547efadc1914e7822bd684b5c862e5df403"} Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.672464 4651 scope.go:117] "RemoveContainer" containerID="ad887845c585d6e5bcef2dd124a04b5b952a0903b3ca9b4454446fdca2ea51c1" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.672717 4651 scope.go:117] "RemoveContainer" containerID="ae67209f929776e077fd695c2b58f547efadc1914e7822bd684b5c862e5df403" Nov 26 15:12:59 crc kubenswrapper[4651]: E1126 15:12:59.672935 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=neutron-operator-controller-manager-6fdcddb789-8h624_openstack-operators(8271ec0d-f8ea-4c46-984f-95572691a379)\"" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" podUID="8271ec0d-f8ea-4c46-984f-95572691a379" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.675304 4651 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" event={"ID":"8cd427a2-9759-460e-b86e-23e08dd7ba78","Type":"ContainerStarted","Data":"eb647b9156b6f3ab2d1a11ee001809041a06d025e81ebbe2df019ffaa217776d"} Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.675467 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.677590 4651 generic.go:334] "Generic (PLEG): container finished" podID="85fb4e98-47db-403d-85e3-c2550cd47160" containerID="09100959ca7bdf7d783741de3afd6c235fa6e921771a8abea9d4cce719368185" exitCode=1 Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.677640 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" event={"ID":"85fb4e98-47db-403d-85e3-c2550cd47160","Type":"ContainerDied","Data":"09100959ca7bdf7d783741de3afd6c235fa6e921771a8abea9d4cce719368185"} Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.678559 4651 scope.go:117] "RemoveContainer" containerID="09100959ca7bdf7d783741de3afd6c235fa6e921771a8abea9d4cce719368185" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.679691 4651 generic.go:334] "Generic (PLEG): container finished" podID="eed373f0-add9-4ae8-b5cc-ed711e79b5c5" containerID="cce1cfc1869d9794a36a4e425555a8e364f6f3dc1b078a62e6480b16a2ca5ec2" exitCode=1 Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.679770 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" event={"ID":"eed373f0-add9-4ae8-b5cc-ed711e79b5c5","Type":"ContainerDied","Data":"cce1cfc1869d9794a36a4e425555a8e364f6f3dc1b078a62e6480b16a2ca5ec2"} Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.680196 4651 scope.go:117] "RemoveContainer" 
containerID="cce1cfc1869d9794a36a4e425555a8e364f6f3dc1b078a62e6480b16a2ca5ec2" Nov 26 15:12:59 crc kubenswrapper[4651]: E1126 15:12:59.680428 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=horizon-operator-controller-manager-5d494799bf-v89cv_openstack-operators(eed373f0-add9-4ae8-b5cc-ed711e79b5c5)\"" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" podUID="eed373f0-add9-4ae8-b5cc-ed711e79b5c5" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.682020 4651 generic.go:334] "Generic (PLEG): container finished" podID="e5c0812c-3183-4f45-b6b9-d4975f8bb80a" containerID="7c46a1b0f4f43559adce3d1a67d26ec5eaac5f1152a61a729bcdb293d1a406ab" exitCode=1 Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.682086 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" event={"ID":"e5c0812c-3183-4f45-b6b9-d4975f8bb80a","Type":"ContainerDied","Data":"7c46a1b0f4f43559adce3d1a67d26ec5eaac5f1152a61a729bcdb293d1a406ab"} Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.682454 4651 scope.go:117] "RemoveContainer" containerID="7c46a1b0f4f43559adce3d1a67d26ec5eaac5f1152a61a729bcdb293d1a406ab" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.684497 4651 generic.go:334] "Generic (PLEG): container finished" podID="8a55643f-68a5-47ea-8b27-db437d3af215" containerID="29d6967888c7f1e1b4e4d2dd10230d66cb68af8b05e13bbbd4da168847ceb8ad" exitCode=1 Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.684554 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" event={"ID":"8a55643f-68a5-47ea-8b27-db437d3af215","Type":"ContainerDied","Data":"29d6967888c7f1e1b4e4d2dd10230d66cb68af8b05e13bbbd4da168847ceb8ad"} Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 
15:12:59.684983 4651 scope.go:117] "RemoveContainer" containerID="29d6967888c7f1e1b4e4d2dd10230d66cb68af8b05e13bbbd4da168847ceb8ad" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.686638 4651 generic.go:334] "Generic (PLEG): container finished" podID="719afb5d-40c4-4fa3-b030-38c170fc7dbb" containerID="716508fb3ef35f2dbd7cbdb035a325a4470da4b25ea778ec04b6babf810901b9" exitCode=1 Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.686683 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" event={"ID":"719afb5d-40c4-4fa3-b030-38c170fc7dbb","Type":"ContainerDied","Data":"716508fb3ef35f2dbd7cbdb035a325a4470da4b25ea778ec04b6babf810901b9"} Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.687211 4651 scope.go:117] "RemoveContainer" containerID="716508fb3ef35f2dbd7cbdb035a325a4470da4b25ea778ec04b6babf810901b9" Nov 26 15:12:59 crc kubenswrapper[4651]: E1126 15:12:59.687449 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=swift-operator-controller-manager-d77b94747-6kjgs_openstack-operators(719afb5d-40c4-4fa3-b030-38c170fc7dbb)\"" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" podUID="719afb5d-40c4-4fa3-b030-38c170fc7dbb" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.693123 4651 generic.go:334] "Generic (PLEG): container finished" podID="e9981be4-751d-4c74-894a-698adad4c50f" containerID="e4e86fbd4a7f1e3ca3c6445da766092d64defe9efc96d1b66e6e2082bc18e9f7" exitCode=1 Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.693206 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" event={"ID":"e9981be4-751d-4c74-894a-698adad4c50f","Type":"ContainerDied","Data":"e4e86fbd4a7f1e3ca3c6445da766092d64defe9efc96d1b66e6e2082bc18e9f7"} Nov 26 
15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.695971 4651 scope.go:117] "RemoveContainer" containerID="e4e86fbd4a7f1e3ca3c6445da766092d64defe9efc96d1b66e6e2082bc18e9f7" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.696538 4651 generic.go:334] "Generic (PLEG): container finished" podID="a72e6d14-1571-4b70-b872-a4a4b0b3c242" containerID="0104f495a20a880cdd317a6da9eb4040c59e5ada8c866b537adbb1edf8e011b1" exitCode=1 Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.696588 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" event={"ID":"a72e6d14-1571-4b70-b872-a4a4b0b3c242","Type":"ContainerDied","Data":"0104f495a20a880cdd317a6da9eb4040c59e5ada8c866b537adbb1edf8e011b1"} Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.698134 4651 scope.go:117] "RemoveContainer" containerID="0104f495a20a880cdd317a6da9eb4040c59e5ada8c866b537adbb1edf8e011b1" Nov 26 15:12:59 crc kubenswrapper[4651]: I1126 15:12:59.900565 4651 scope.go:117] "RemoveContainer" containerID="a71712d16bc040e56e42409d28dfe7133b928d732e60f0bf7fd5948d3351fabe" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.031172 4651 scope.go:117] "RemoveContainer" containerID="4aec82603b80b561de33cfb0129cc4a117978fa847ac8eb77ffb74b4e2c43db9" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.590355 4651 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.641798 4651 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="21bde69f-6ea1-474c-b58d-226697f8cff1" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.706718 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" 
event={"ID":"85fb4e98-47db-403d-85e3-c2550cd47160","Type":"ContainerStarted","Data":"d945781061fd3e783cfa06cb5b7f62620dd928883b1e5cdc2160afd8576ba71d"} Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.707166 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.709461 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" event={"ID":"e9981be4-751d-4c74-894a-698adad4c50f","Type":"ContainerStarted","Data":"89a423091fbe525761394ca14b0216f9d5e770c147f667c25cd0f0e08bd4c69e"} Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.709677 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.730635 4651 generic.go:334] "Generic (PLEG): container finished" podID="ec10af15-dcf5-413d-87ef-0ca5a469b5fa" containerID="e5acf425e9f900ca037b72b89b3cc4f5a49f57d2d78af4fd208599e8eaf827b5" exitCode=1 Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.730689 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" event={"ID":"ec10af15-dcf5-413d-87ef-0ca5a469b5fa","Type":"ContainerDied","Data":"e5acf425e9f900ca037b72b89b3cc4f5a49f57d2d78af4fd208599e8eaf827b5"} Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.731023 4651 scope.go:117] "RemoveContainer" containerID="66d4e1a8a7464741d6291f866867bf3b5f18546445bfdc1338aff5d4014af56a" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.731705 4651 scope.go:117] "RemoveContainer" containerID="e5acf425e9f900ca037b72b89b3cc4f5a49f57d2d78af4fd208599e8eaf827b5" Nov 26 15:13:00 crc kubenswrapper[4651]: E1126 15:13:00.732058 4651 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=barbican-operator-controller-manager-7b64f4fb85-5jb5x_openstack-operators(ec10af15-dcf5-413d-87ef-0ca5a469b5fa)\"" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" podUID="ec10af15-dcf5-413d-87ef-0ca5a469b5fa" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.733278 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" event={"ID":"8a55643f-68a5-47ea-8b27-db437d3af215","Type":"ContainerStarted","Data":"3cedf3ccae7cba75bc1b6b2ea825193f2bebd174ad51fedeb50c8e2455105012"} Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.733609 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.751424 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" event={"ID":"e5c0812c-3183-4f45-b6b9-d4975f8bb80a","Type":"ContainerStarted","Data":"c23ab9b50a1b00edef74924d29298407589e5e2bf5947109db3caa8896f8e645"} Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.751683 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.753815 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" event={"ID":"a72e6d14-1571-4b70-b872-a4a4b0b3c242","Type":"ContainerStarted","Data":"8e6d638dd8622b109bdc989d4b082b8775a2ab52d359ae2030d59448fa73ddc4"} Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.759477 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" event={"ID":"66532d04-3411-4813-ae53-4d635ee98911","Type":"ContainerStarted","Data":"f0f93e86575df7ab4752e098b888c588e3c053096fe9e7d6b159a3cd5b591051"} Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.760946 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.760989 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" event={"ID":"a8e49781-2e0b-476d-be9f-e17f05639447","Type":"ContainerStarted","Data":"52461601bfcd5ab86a930fcaa0701b10b0c34bfe00b1af8757a116f097dfc44e"} Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.761781 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.762622 4651 generic.go:334] "Generic (PLEG): container finished" podID="dc5a51cf-b992-4542-8b00-2948ab513eed" containerID="6a8080f48adc920ab614914aeb492288994ad89d16d849f7fd0e64a5bc233d94" exitCode=1 Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.762678 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" event={"ID":"dc5a51cf-b992-4542-8b00-2948ab513eed","Type":"ContainerDied","Data":"6a8080f48adc920ab614914aeb492288994ad89d16d849f7fd0e64a5bc233d94"} Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.763317 4651 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="12b5ceb0-b9c6-412e-ab66-35eb5612345d" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.763339 4651 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="12b5ceb0-b9c6-412e-ab66-35eb5612345d" Nov 26 15:13:00 crc kubenswrapper[4651]: I1126 15:13:00.763483 4651 scope.go:117] "RemoveContainer" containerID="6a8080f48adc920ab614914aeb492288994ad89d16d849f7fd0e64a5bc233d94" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.777330 4651 generic.go:334] "Generic (PLEG): container finished" podID="85fb4e98-47db-403d-85e3-c2550cd47160" containerID="d945781061fd3e783cfa06cb5b7f62620dd928883b1e5cdc2160afd8576ba71d" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.778197 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" event={"ID":"85fb4e98-47db-403d-85e3-c2550cd47160","Type":"ContainerDied","Data":"d945781061fd3e783cfa06cb5b7f62620dd928883b1e5cdc2160afd8576ba71d"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.778251 4651 scope.go:117] "RemoveContainer" containerID="09100959ca7bdf7d783741de3afd6c235fa6e921771a8abea9d4cce719368185" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.779459 4651 scope.go:117] "RemoveContainer" containerID="d945781061fd3e783cfa06cb5b7f62620dd928883b1e5cdc2160afd8576ba71d" Nov 26 15:13:01 crc kubenswrapper[4651]: E1126 15:13:01.779710 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=cinder-operator-controller-manager-6b7f75547b-k4tq9_openstack-operators(85fb4e98-47db-403d-85e3-c2550cd47160)\"" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" podUID="85fb4e98-47db-403d-85e3-c2550cd47160" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.785881 4651 generic.go:334] "Generic (PLEG): container finished" podID="e5c0812c-3183-4f45-b6b9-d4975f8bb80a" containerID="c23ab9b50a1b00edef74924d29298407589e5e2bf5947109db3caa8896f8e645" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.785972 4651 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" event={"ID":"e5c0812c-3183-4f45-b6b9-d4975f8bb80a","Type":"ContainerDied","Data":"c23ab9b50a1b00edef74924d29298407589e5e2bf5947109db3caa8896f8e645"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.786740 4651 scope.go:117] "RemoveContainer" containerID="c23ab9b50a1b00edef74924d29298407589e5e2bf5947109db3caa8896f8e645" Nov 26 15:13:01 crc kubenswrapper[4651]: E1126 15:13:01.787053 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=heat-operator-controller-manager-5b77f656f-pt9q8_openstack-operators(e5c0812c-3183-4f45-b6b9-d4975f8bb80a)\"" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" podUID="e5c0812c-3183-4f45-b6b9-d4975f8bb80a" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.811364 4651 generic.go:334] "Generic (PLEG): container finished" podID="66532d04-3411-4813-ae53-4d635ee98911" containerID="f0f93e86575df7ab4752e098b888c588e3c053096fe9e7d6b159a3cd5b591051" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.811473 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" event={"ID":"66532d04-3411-4813-ae53-4d635ee98911","Type":"ContainerDied","Data":"f0f93e86575df7ab4752e098b888c588e3c053096fe9e7d6b159a3cd5b591051"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.812027 4651 scope.go:117] "RemoveContainer" containerID="f0f93e86575df7ab4752e098b888c588e3c053096fe9e7d6b159a3cd5b591051" Nov 26 15:13:01 crc kubenswrapper[4651]: E1126 15:13:01.812336 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=telemetry-operator-controller-manager-76cc84c6bb-8zvlb_openstack-operators(66532d04-3411-4813-ae53-4d635ee98911)\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" podUID="66532d04-3411-4813-ae53-4d635ee98911" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.813724 4651 generic.go:334] "Generic (PLEG): container finished" podID="99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6" containerID="8836958b25c08ba398d2599e613f6fe57f79fd51839e0bed6314f94ea6d1b99c" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.813812 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" event={"ID":"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6","Type":"ContainerDied","Data":"8836958b25c08ba398d2599e613f6fe57f79fd51839e0bed6314f94ea6d1b99c"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.814520 4651 scope.go:117] "RemoveContainer" containerID="8836958b25c08ba398d2599e613f6fe57f79fd51839e0bed6314f94ea6d1b99c" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.819517 4651 generic.go:334] "Generic (PLEG): container finished" podID="674eb001-765e-433a-89d6-2a82fb599a93" containerID="c563184a72db972980c8e7bee4ee080ab41751f8ee8d68204bfab0c762a9a579" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.819615 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" event={"ID":"674eb001-765e-433a-89d6-2a82fb599a93","Type":"ContainerDied","Data":"c563184a72db972980c8e7bee4ee080ab41751f8ee8d68204bfab0c762a9a579"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.820118 4651 scope.go:117] "RemoveContainer" containerID="c563184a72db972980c8e7bee4ee080ab41751f8ee8d68204bfab0c762a9a579" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.825311 4651 generic.go:334] "Generic (PLEG): container finished" podID="dc5a51cf-b992-4542-8b00-2948ab513eed" 
containerID="0c597ff292f2433a636abd52aefbd6d01f3c77267941ef1aee522c71d06f77d9" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.825354 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" event={"ID":"dc5a51cf-b992-4542-8b00-2948ab513eed","Type":"ContainerDied","Data":"0c597ff292f2433a636abd52aefbd6d01f3c77267941ef1aee522c71d06f77d9"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.825943 4651 scope.go:117] "RemoveContainer" containerID="0c597ff292f2433a636abd52aefbd6d01f3c77267941ef1aee522c71d06f77d9" Nov 26 15:13:01 crc kubenswrapper[4651]: E1126 15:13:01.826220 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=keystone-operator-controller-manager-7b4567c7cf-hmndm_openstack-operators(dc5a51cf-b992-4542-8b00-2948ab513eed)\"" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" podUID="dc5a51cf-b992-4542-8b00-2948ab513eed" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.828404 4651 generic.go:334] "Generic (PLEG): container finished" podID="53400076-0e4e-4e0b-b476-d4a1fd901631" containerID="e899c242aef2981c2a06f693cd6864e828910b21b4b430bb54f4fa497a21a270" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.828461 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" event={"ID":"53400076-0e4e-4e0b-b476-d4a1fd901631","Type":"ContainerDied","Data":"e899c242aef2981c2a06f693cd6864e828910b21b4b430bb54f4fa497a21a270"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.828854 4651 scope.go:117] "RemoveContainer" containerID="e899c242aef2981c2a06f693cd6864e828910b21b4b430bb54f4fa497a21a270" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.831278 4651 generic.go:334] "Generic (PLEG): container 
finished" podID="e9981be4-751d-4c74-894a-698adad4c50f" containerID="89a423091fbe525761394ca14b0216f9d5e770c147f667c25cd0f0e08bd4c69e" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.831324 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" event={"ID":"e9981be4-751d-4c74-894a-698adad4c50f","Type":"ContainerDied","Data":"89a423091fbe525761394ca14b0216f9d5e770c147f667c25cd0f0e08bd4c69e"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.831634 4651 scope.go:117] "RemoveContainer" containerID="89a423091fbe525761394ca14b0216f9d5e770c147f667c25cd0f0e08bd4c69e" Nov 26 15:13:01 crc kubenswrapper[4651]: E1126 15:13:01.832061 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=nova-operator-controller-manager-79556f57fc-cnwcz_openstack-operators(e9981be4-751d-4c74-894a-698adad4c50f)\"" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" podUID="e9981be4-751d-4c74-894a-698adad4c50f" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.834178 4651 generic.go:334] "Generic (PLEG): container finished" podID="a72e6d14-1571-4b70-b872-a4a4b0b3c242" containerID="8e6d638dd8622b109bdc989d4b082b8775a2ab52d359ae2030d59448fa73ddc4" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.834262 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" event={"ID":"a72e6d14-1571-4b70-b872-a4a4b0b3c242","Type":"ContainerDied","Data":"8e6d638dd8622b109bdc989d4b082b8775a2ab52d359ae2030d59448fa73ddc4"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.835236 4651 scope.go:117] "RemoveContainer" containerID="8e6d638dd8622b109bdc989d4b082b8775a2ab52d359ae2030d59448fa73ddc4" Nov 26 15:13:01 crc kubenswrapper[4651]: E1126 15:13:01.835615 4651 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=operator pod=rabbitmq-cluster-operator-manager-668c99d594-wwjsd_openstack-operators(a72e6d14-1571-4b70-b872-a4a4b0b3c242)\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" podUID="a72e6d14-1571-4b70-b872-a4a4b0b3c242" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.839673 4651 generic.go:334] "Generic (PLEG): container finished" podID="8a55643f-68a5-47ea-8b27-db437d3af215" containerID="3cedf3ccae7cba75bc1b6b2ea825193f2bebd174ad51fedeb50c8e2455105012" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.840341 4651 scope.go:117] "RemoveContainer" containerID="3cedf3ccae7cba75bc1b6b2ea825193f2bebd174ad51fedeb50c8e2455105012" Nov 26 15:13:01 crc kubenswrapper[4651]: E1126 15:13:01.840584 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=mariadb-operator-controller-manager-66f4dd4bc7-ffbs5_openstack-operators(8a55643f-68a5-47ea-8b27-db437d3af215)\"" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" podUID="8a55643f-68a5-47ea-8b27-db437d3af215" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.840616 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" event={"ID":"8a55643f-68a5-47ea-8b27-db437d3af215","Type":"ContainerDied","Data":"3cedf3ccae7cba75bc1b6b2ea825193f2bebd174ad51fedeb50c8e2455105012"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.844274 4651 generic.go:334] "Generic (PLEG): container finished" podID="a8e49781-2e0b-476d-be9f-e17f05639447" containerID="52461601bfcd5ab86a930fcaa0701b10b0c34bfe00b1af8757a116f097dfc44e" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 
15:13:01.844352 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" event={"ID":"a8e49781-2e0b-476d-be9f-e17f05639447","Type":"ContainerDied","Data":"52461601bfcd5ab86a930fcaa0701b10b0c34bfe00b1af8757a116f097dfc44e"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.844665 4651 scope.go:117] "RemoveContainer" containerID="52461601bfcd5ab86a930fcaa0701b10b0c34bfe00b1af8757a116f097dfc44e" Nov 26 15:13:01 crc kubenswrapper[4651]: E1126 15:13:01.844889 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=placement-operator-controller-manager-57988cc5b5-269d2_openstack-operators(a8e49781-2e0b-476d-be9f-e17f05639447)\"" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" podUID="a8e49781-2e0b-476d-be9f-e17f05639447" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.849757 4651 generic.go:334] "Generic (PLEG): container finished" podID="6b7bc81d-5bbe-4c1b-a512-93e75a1f7035" containerID="9306a8b04c2d0a0d6f9de98d6be8ed6001c3035b9773ed5831a0c8b5c465adf2" exitCode=1 Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.849804 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" event={"ID":"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035","Type":"ContainerDied","Data":"9306a8b04c2d0a0d6f9de98d6be8ed6001c3035b9773ed5831a0c8b5c465adf2"} Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.851000 4651 scope.go:117] "RemoveContainer" containerID="9306a8b04c2d0a0d6f9de98d6be8ed6001c3035b9773ed5831a0c8b5c465adf2" Nov 26 15:13:01 crc kubenswrapper[4651]: I1126 15:13:01.854128 4651 scope.go:117] "RemoveContainer" containerID="7c46a1b0f4f43559adce3d1a67d26ec5eaac5f1152a61a729bcdb293d1a406ab" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 
15:13:02.009376 4651 scope.go:117] "RemoveContainer" containerID="254bb7226198214e8b2d36818fe4295bed20fbe71567f6521ae1043594715684" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.113225 4651 scope.go:117] "RemoveContainer" containerID="6a8080f48adc920ab614914aeb492288994ad89d16d849f7fd0e64a5bc233d94" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.146896 4651 scope.go:117] "RemoveContainer" containerID="e4e86fbd4a7f1e3ca3c6445da766092d64defe9efc96d1b66e6e2082bc18e9f7" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.178243 4651 scope.go:117] "RemoveContainer" containerID="0104f495a20a880cdd317a6da9eb4040c59e5ada8c866b537adbb1edf8e011b1" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.204777 4651 scope.go:117] "RemoveContainer" containerID="29d6967888c7f1e1b4e4d2dd10230d66cb68af8b05e13bbbd4da168847ceb8ad" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.236749 4651 scope.go:117] "RemoveContainer" containerID="5f4df4abc8cfc206de00d39d6be18c73e05ea9a7a2a6343dce8f6c26d5729dbf" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.441278 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.441660 4651 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="12b5ceb0-b9c6-412e-ab66-35eb5612345d" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.441681 4651 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="12b5ceb0-b9c6-412e-ab66-35eb5612345d" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.446061 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.492482 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.860409 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" event={"ID":"6b7bc81d-5bbe-4c1b-a512-93e75a1f7035","Type":"ContainerStarted","Data":"6f00a45fafd9f0e9d2c610a094d6ae20454c9559cdc78cf79642fc9ebe83bd3b"} Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.861058 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.863807 4651 scope.go:117] "RemoveContainer" containerID="c23ab9b50a1b00edef74924d29298407589e5e2bf5947109db3caa8896f8e645" Nov 26 15:13:02 crc kubenswrapper[4651]: E1126 15:13:02.864019 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=heat-operator-controller-manager-5b77f656f-pt9q8_openstack-operators(e5c0812c-3183-4f45-b6b9-d4975f8bb80a)\"" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" podUID="e5c0812c-3183-4f45-b6b9-d4975f8bb80a" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.867178 4651 scope.go:117] "RemoveContainer" containerID="f0f93e86575df7ab4752e098b888c588e3c053096fe9e7d6b159a3cd5b591051" Nov 26 15:13:02 crc kubenswrapper[4651]: E1126 15:13:02.867446 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=telemetry-operator-controller-manager-76cc84c6bb-8zvlb_openstack-operators(66532d04-3411-4813-ae53-4d635ee98911)\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" podUID="66532d04-3411-4813-ae53-4d635ee98911" Nov 26 15:13:02 crc kubenswrapper[4651]: 
I1126 15:13:02.871356 4651 scope.go:117] "RemoveContainer" containerID="52461601bfcd5ab86a930fcaa0701b10b0c34bfe00b1af8757a116f097dfc44e" Nov 26 15:13:02 crc kubenswrapper[4651]: E1126 15:13:02.871632 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=placement-operator-controller-manager-57988cc5b5-269d2_openstack-operators(a8e49781-2e0b-476d-be9f-e17f05639447)\"" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" podUID="a8e49781-2e0b-476d-be9f-e17f05639447" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.875320 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" event={"ID":"674eb001-765e-433a-89d6-2a82fb599a93","Type":"ContainerStarted","Data":"a76d9c3614618b43ae9b99c6811ce8acd8617d6f633a51868d68a8e1c30d46a9"} Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.875511 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.878230 4651 scope.go:117] "RemoveContainer" containerID="89a423091fbe525761394ca14b0216f9d5e770c147f667c25cd0f0e08bd4c69e" Nov 26 15:13:02 crc kubenswrapper[4651]: E1126 15:13:02.878589 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=nova-operator-controller-manager-79556f57fc-cnwcz_openstack-operators(e9981be4-751d-4c74-894a-698adad4c50f)\"" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" podUID="e9981be4-751d-4c74-894a-698adad4c50f" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.881475 4651 generic.go:334] "Generic (PLEG): container finished" 
podID="e8ad6eac-027c-4615-a5dd-6facdc1db056" containerID="6ae5b12af8c621d86bc62319fd366dd54ebd9fb95be30c8f8337c1f7390221a1" exitCode=1 Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.881586 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" event={"ID":"e8ad6eac-027c-4615-a5dd-6facdc1db056","Type":"ContainerDied","Data":"6ae5b12af8c621d86bc62319fd366dd54ebd9fb95be30c8f8337c1f7390221a1"} Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.882713 4651 scope.go:117] "RemoveContainer" containerID="6ae5b12af8c621d86bc62319fd366dd54ebd9fb95be30c8f8337c1f7390221a1" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.897405 4651 generic.go:334] "Generic (PLEG): container finished" podID="5f58ef49-d516-48e5-a508-e4102374d111" containerID="9d1704816bea93211f441f3fb2009a77565f483b2b8e946797c7c3bff6e497fe" exitCode=1 Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.897452 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" event={"ID":"5f58ef49-d516-48e5-a508-e4102374d111","Type":"ContainerDied","Data":"9d1704816bea93211f441f3fb2009a77565f483b2b8e946797c7c3bff6e497fe"} Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.898319 4651 scope.go:117] "RemoveContainer" containerID="9d1704816bea93211f441f3fb2009a77565f483b2b8e946797c7c3bff6e497fe" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.900537 4651 scope.go:117] "RemoveContainer" containerID="d945781061fd3e783cfa06cb5b7f62620dd928883b1e5cdc2160afd8576ba71d" Nov 26 15:13:02 crc kubenswrapper[4651]: E1126 15:13:02.900759 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=cinder-operator-controller-manager-6b7f75547b-k4tq9_openstack-operators(85fb4e98-47db-403d-85e3-c2550cd47160)\"" 
pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" podUID="85fb4e98-47db-403d-85e3-c2550cd47160" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.903652 4651 generic.go:334] "Generic (PLEG): container finished" podID="53400076-0e4e-4e0b-b476-d4a1fd901631" containerID="5f80678806d74e58b9cc4f523aaaf9b90d456ca8ee88ffdcd603f132a11bbd53" exitCode=1 Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.903729 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" event={"ID":"53400076-0e4e-4e0b-b476-d4a1fd901631","Type":"ContainerDied","Data":"5f80678806d74e58b9cc4f523aaaf9b90d456ca8ee88ffdcd603f132a11bbd53"} Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.903763 4651 scope.go:117] "RemoveContainer" containerID="e899c242aef2981c2a06f693cd6864e828910b21b4b430bb54f4fa497a21a270" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.904085 4651 scope.go:117] "RemoveContainer" containerID="5f80678806d74e58b9cc4f523aaaf9b90d456ca8ee88ffdcd603f132a11bbd53" Nov 26 15:13:02 crc kubenswrapper[4651]: E1126 15:13:02.904285 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=manila-operator-controller-manager-5d499bf58b-tszf4_openstack-operators(53400076-0e4e-4e0b-b476-d4a1fd901631)\"" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" podUID="53400076-0e4e-4e0b-b476-d4a1fd901631" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.920861 4651 generic.go:334] "Generic (PLEG): container finished" podID="99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6" containerID="c044390f166cbf04a3406f934d36dee401d396c7c23b4f2b5a5b08e920cfc86f" exitCode=1 Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.920924 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" event={"ID":"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6","Type":"ContainerDied","Data":"c044390f166cbf04a3406f934d36dee401d396c7c23b4f2b5a5b08e920cfc86f"} Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.921332 4651 scope.go:117] "RemoveContainer" containerID="c044390f166cbf04a3406f934d36dee401d396c7c23b4f2b5a5b08e920cfc86f" Nov 26 15:13:02 crc kubenswrapper[4651]: E1126 15:13:02.921604 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-57548d458d-shslt_openstack-operators(99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6)\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" podUID="99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.928745 4651 generic.go:334] "Generic (PLEG): container finished" podID="6a660fe2-a185-4e56-98cb-b12cdd749964" containerID="19838810e2e215a7b2662e85aec61819aef61f77f3fc2f0bf80f3814a1031970" exitCode=1 Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.928814 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" event={"ID":"6a660fe2-a185-4e56-98cb-b12cdd749964","Type":"ContainerDied","Data":"19838810e2e215a7b2662e85aec61819aef61f77f3fc2f0bf80f3814a1031970"} Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.929394 4651 scope.go:117] "RemoveContainer" containerID="19838810e2e215a7b2662e85aec61819aef61f77f3fc2f0bf80f3814a1031970" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.937444 4651 scope.go:117] "RemoveContainer" containerID="3cedf3ccae7cba75bc1b6b2ea825193f2bebd174ad51fedeb50c8e2455105012" Nov 26 15:13:02 crc kubenswrapper[4651]: E1126 15:13:02.937742 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=mariadb-operator-controller-manager-66f4dd4bc7-ffbs5_openstack-operators(8a55643f-68a5-47ea-8b27-db437d3af215)\"" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" podUID="8a55643f-68a5-47ea-8b27-db437d3af215" Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.940170 4651 generic.go:334] "Generic (PLEG): container finished" podID="ce4c06a7-4bcb-4167-bec1-14a45ca24bea" containerID="08f68236bdb9a5f9948976152995692a30fb7b140f874c52e1a8df726eab5227" exitCode=1 Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.940259 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" event={"ID":"ce4c06a7-4bcb-4167-bec1-14a45ca24bea","Type":"ContainerDied","Data":"08f68236bdb9a5f9948976152995692a30fb7b140f874c52e1a8df726eab5227"} Nov 26 15:13:02 crc kubenswrapper[4651]: I1126 15:13:02.942165 4651 scope.go:117] "RemoveContainer" containerID="08f68236bdb9a5f9948976152995692a30fb7b140f874c52e1a8df726eab5227" Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.051966 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.052026 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.088592 4651 scope.go:117] "RemoveContainer" containerID="8836958b25c08ba398d2599e613f6fe57f79fd51839e0bed6314f94ea6d1b99c" Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.958759 4651 generic.go:334] "Generic (PLEG): container finished" podID="5f58ef49-d516-48e5-a508-e4102374d111" containerID="254deb1fc7d72a60a6d21c4d94d1447e8296be0ed3618004bc03d15c04f3f52e" exitCode=1 Nov 26 15:13:03 crc 
kubenswrapper[4651]: I1126 15:13:03.958803 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" event={"ID":"5f58ef49-d516-48e5-a508-e4102374d111","Type":"ContainerDied","Data":"254deb1fc7d72a60a6d21c4d94d1447e8296be0ed3618004bc03d15c04f3f52e"} Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.959173 4651 scope.go:117] "RemoveContainer" containerID="9d1704816bea93211f441f3fb2009a77565f483b2b8e946797c7c3bff6e497fe" Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.959768 4651 scope.go:117] "RemoveContainer" containerID="254deb1fc7d72a60a6d21c4d94d1447e8296be0ed3618004bc03d15c04f3f52e" Nov 26 15:13:03 crc kubenswrapper[4651]: E1126 15:13:03.960096 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=designate-operator-controller-manager-955677c94-q8cjf_openstack-operators(5f58ef49-d516-48e5-a508-e4102374d111)\"" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" podUID="5f58ef49-d516-48e5-a508-e4102374d111" Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.968348 4651 scope.go:117] "RemoveContainer" containerID="c044390f166cbf04a3406f934d36dee401d396c7c23b4f2b5a5b08e920cfc86f" Nov 26 15:13:03 crc kubenswrapper[4651]: E1126 15:13:03.968774 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-57548d458d-shslt_openstack-operators(99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6)\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" podUID="99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6" Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.973212 4651 generic.go:334] "Generic (PLEG): container finished" podID="ce4c06a7-4bcb-4167-bec1-14a45ca24bea" 
containerID="11cf52b297be608aadae5d2da899355aebb2aa49c0e147e8cf680857b1b0e84e" exitCode=1 Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.973299 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" event={"ID":"ce4c06a7-4bcb-4167-bec1-14a45ca24bea","Type":"ContainerDied","Data":"11cf52b297be608aadae5d2da899355aebb2aa49c0e147e8cf680857b1b0e84e"} Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.973974 4651 scope.go:117] "RemoveContainer" containerID="11cf52b297be608aadae5d2da899355aebb2aa49c0e147e8cf680857b1b0e84e" Nov 26 15:13:03 crc kubenswrapper[4651]: E1126 15:13:03.974274 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ovn-operator-controller-manager-56897c768d-k2rdd_openstack-operators(ce4c06a7-4bcb-4167-bec1-14a45ca24bea)\"" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" podUID="ce4c06a7-4bcb-4167-bec1-14a45ca24bea" Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.977673 4651 generic.go:334] "Generic (PLEG): container finished" podID="6a660fe2-a185-4e56-98cb-b12cdd749964" containerID="142315b59159b52645d68efe400c216bfc93ba5a82018b216c5a37634dca29fc" exitCode=1 Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.977761 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" event={"ID":"6a660fe2-a185-4e56-98cb-b12cdd749964","Type":"ContainerDied","Data":"142315b59159b52645d68efe400c216bfc93ba5a82018b216c5a37634dca29fc"} Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.978255 4651 scope.go:117] "RemoveContainer" containerID="142315b59159b52645d68efe400c216bfc93ba5a82018b216c5a37634dca29fc" Nov 26 15:13:03 crc kubenswrapper[4651]: E1126 15:13:03.978604 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=glance-operator-controller-manager-589cbd6b5b-gqj7p_openstack-operators(6a660fe2-a185-4e56-98cb-b12cdd749964)\"" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" podUID="6a660fe2-a185-4e56-98cb-b12cdd749964" Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.981746 4651 generic.go:334] "Generic (PLEG): container finished" podID="e8ad6eac-027c-4615-a5dd-6facdc1db056" containerID="410c3068e477b7b02d24d9247fb5e9710d1c993afbbb8fee84a2f57cb6ec2a76" exitCode=1 Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.981900 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" event={"ID":"e8ad6eac-027c-4615-a5dd-6facdc1db056","Type":"ContainerDied","Data":"410c3068e477b7b02d24d9247fb5e9710d1c993afbbb8fee84a2f57cb6ec2a76"} Nov 26 15:13:03 crc kubenswrapper[4651]: I1126 15:13:03.983546 4651 scope.go:117] "RemoveContainer" containerID="410c3068e477b7b02d24d9247fb5e9710d1c993afbbb8fee84a2f57cb6ec2a76" Nov 26 15:13:03 crc kubenswrapper[4651]: E1126 15:13:03.984377 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=watcher-operator-controller-manager-656dcb59d4-s5dd9_openstack-operators(e8ad6eac-027c-4615-a5dd-6facdc1db056)\"" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" podUID="e8ad6eac-027c-4615-a5dd-6facdc1db056" Nov 26 15:13:04 crc kubenswrapper[4651]: I1126 15:13:04.012827 4651 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="21bde69f-6ea1-474c-b58d-226697f8cff1" Nov 26 15:13:04 crc kubenswrapper[4651]: I1126 15:13:04.013154 4651 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:13:04 crc kubenswrapper[4651]: I1126 15:13:04.013339 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:13:04 crc kubenswrapper[4651]: I1126 15:13:04.014277 4651 scope.go:117] "RemoveContainer" containerID="3f4740a4ce4122a4d499010e74fdc53d29f2871babb37d597d7d0ee9f15a0c73" Nov 26 15:13:04 crc kubenswrapper[4651]: E1126 15:13:04.014778 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=openstack-operator-controller-manager-5bcdd9fbc-vsb4g_openstack-operators(e50a607f-7a61-4a78-870a-297fa0daa977)\"" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" podUID="e50a607f-7a61-4a78-870a-297fa0daa977" Nov 26 15:13:04 crc kubenswrapper[4651]: I1126 15:13:04.024798 4651 scope.go:117] "RemoveContainer" containerID="08f68236bdb9a5f9948976152995692a30fb7b140f874c52e1a8df726eab5227" Nov 26 15:13:04 crc kubenswrapper[4651]: I1126 15:13:04.077148 4651 scope.go:117] "RemoveContainer" containerID="19838810e2e215a7b2662e85aec61819aef61f77f3fc2f0bf80f3814a1031970" Nov 26 15:13:04 crc kubenswrapper[4651]: I1126 15:13:04.134022 4651 scope.go:117] "RemoveContainer" containerID="6ae5b12af8c621d86bc62319fd366dd54ebd9fb95be30c8f8337c1f7390221a1" Nov 26 15:13:05 crc kubenswrapper[4651]: I1126 15:13:05.000157 4651 scope.go:117] "RemoveContainer" containerID="3f4740a4ce4122a4d499010e74fdc53d29f2871babb37d597d7d0ee9f15a0c73" Nov 26 15:13:05 crc kubenswrapper[4651]: I1126 15:13:05.000304 4651 scope.go:117] "RemoveContainer" containerID="c044390f166cbf04a3406f934d36dee401d396c7c23b4f2b5a5b08e920cfc86f" Nov 26 15:13:05 crc kubenswrapper[4651]: E1126 15:13:05.000446 4651 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=openstack-operator-controller-manager-5bcdd9fbc-vsb4g_openstack-operators(e50a607f-7a61-4a78-870a-297fa0daa977)\"" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" podUID="e50a607f-7a61-4a78-870a-297fa0daa977" Nov 26 15:13:05 crc kubenswrapper[4651]: E1126 15:13:05.000527 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-57548d458d-shslt_openstack-operators(99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6)\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" podUID="99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6" Nov 26 15:13:05 crc kubenswrapper[4651]: I1126 15:13:05.362716 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="8c2d03fc-6edd-4654-8116-99aae88e3fab" containerName="kube-state-metrics" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 26 15:13:05 crc kubenswrapper[4651]: I1126 15:13:05.362778 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/kube-state-metrics-0" Nov 26 15:13:05 crc kubenswrapper[4651]: I1126 15:13:05.363467 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-state-metrics" containerStatusID={"Type":"cri-o","ID":"4c53e8f3c8733df32293257ea5bf33276310b5635496382939e3ba53e1d5b90e"} pod="openstack/kube-state-metrics-0" containerMessage="Container kube-state-metrics failed liveness probe, will be restarted" Nov 26 15:13:05 crc kubenswrapper[4651]: I1126 15:13:05.363512 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8c2d03fc-6edd-4654-8116-99aae88e3fab" 
containerName="kube-state-metrics" containerID="cri-o://4c53e8f3c8733df32293257ea5bf33276310b5635496382939e3ba53e1d5b90e" gracePeriod=30 Nov 26 15:13:06 crc kubenswrapper[4651]: I1126 15:13:06.008854 4651 generic.go:334] "Generic (PLEG): container finished" podID="8c2d03fc-6edd-4654-8116-99aae88e3fab" containerID="4c53e8f3c8733df32293257ea5bf33276310b5635496382939e3ba53e1d5b90e" exitCode=2 Nov 26 15:13:06 crc kubenswrapper[4651]: I1126 15:13:06.009232 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8c2d03fc-6edd-4654-8116-99aae88e3fab","Type":"ContainerDied","Data":"4c53e8f3c8733df32293257ea5bf33276310b5635496382939e3ba53e1d5b90e"} Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.020083 4651 generic.go:334] "Generic (PLEG): container finished" podID="8c2d03fc-6edd-4654-8116-99aae88e3fab" containerID="607050b38072b6ce707df72bc43372a60626df65e9a49880f8f7d1d487708711" exitCode=1 Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.020132 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8c2d03fc-6edd-4654-8116-99aae88e3fab","Type":"ContainerDied","Data":"607050b38072b6ce707df72bc43372a60626df65e9a49880f8f7d1d487708711"} Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.020452 4651 scope.go:117] "RemoveContainer" containerID="4c53e8f3c8733df32293257ea5bf33276310b5635496382939e3ba53e1d5b90e" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.021335 4651 scope.go:117] "RemoveContainer" containerID="607050b38072b6ce707df72bc43372a60626df65e9a49880f8f7d1d487708711" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.213088 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.213152 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.213830 4651 scope.go:117] "RemoveContainer" containerID="d945781061fd3e783cfa06cb5b7f62620dd928883b1e5cdc2160afd8576ba71d" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.214194 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=cinder-operator-controller-manager-6b7f75547b-k4tq9_openstack-operators(85fb4e98-47db-403d-85e3-c2550cd47160)\"" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" podUID="85fb4e98-47db-403d-85e3-c2550cd47160" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.227261 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.227351 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.228142 4651 scope.go:117] "RemoveContainer" containerID="e5acf425e9f900ca037b72b89b3cc4f5a49f57d2d78af4fd208599e8eaf827b5" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.228381 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=barbican-operator-controller-manager-7b64f4fb85-5jb5x_openstack-operators(ec10af15-dcf5-413d-87ef-0ca5a469b5fa)\"" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" podUID="ec10af15-dcf5-413d-87ef-0ca5a469b5fa" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.326491 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.326561 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.327361 4651 scope.go:117] "RemoveContainer" containerID="254deb1fc7d72a60a6d21c4d94d1447e8296be0ed3618004bc03d15c04f3f52e" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.327612 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=designate-operator-controller-manager-955677c94-q8cjf_openstack-operators(5f58ef49-d516-48e5-a508-e4102374d111)\"" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" podUID="5f58ef49-d516-48e5-a508-e4102374d111" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.333823 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.333874 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.334602 4651 scope.go:117] "RemoveContainer" containerID="142315b59159b52645d68efe400c216bfc93ba5a82018b216c5a37634dca29fc" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.334939 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=glance-operator-controller-manager-589cbd6b5b-gqj7p_openstack-operators(6a660fe2-a185-4e56-98cb-b12cdd749964)\"" 
pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" podUID="6a660fe2-a185-4e56-98cb-b12cdd749964" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.440839 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.440876 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.441466 4651 scope.go:117] "RemoveContainer" containerID="cce1cfc1869d9794a36a4e425555a8e364f6f3dc1b078a62e6480b16a2ca5ec2" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.441674 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=horizon-operator-controller-manager-5d494799bf-v89cv_openstack-operators(eed373f0-add9-4ae8-b5cc-ed711e79b5c5)\"" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" podUID="eed373f0-add9-4ae8-b5cc-ed711e79b5c5" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.463538 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.463685 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.464418 4651 scope.go:117] "RemoveContainer" containerID="0c597ff292f2433a636abd52aefbd6d01f3c77267941ef1aee522c71d06f77d9" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.464797 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=keystone-operator-controller-manager-7b4567c7cf-hmndm_openstack-operators(dc5a51cf-b992-4542-8b00-2948ab513eed)\"" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" podUID="dc5a51cf-b992-4542-8b00-2948ab513eed" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.485392 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.486178 4651 scope.go:117] "RemoveContainer" containerID="8094d95eeaf473de01c22d35983ec9c10b42c0c887ef3a53c8e159a476b692a1" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.555493 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.555795 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.556763 4651 scope.go:117] "RemoveContainer" containerID="5f80678806d74e58b9cc4f523aaaf9b90d456ca8ee88ffdcd603f132a11bbd53" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.557182 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=manila-operator-controller-manager-5d499bf58b-tszf4_openstack-operators(53400076-0e4e-4e0b-b476-d4a1fd901631)\"" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" podUID="53400076-0e4e-4e0b-b476-d4a1fd901631" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.639870 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.640817 4651 scope.go:117] "RemoveContainer" containerID="3cedf3ccae7cba75bc1b6b2ea825193f2bebd174ad51fedeb50c8e2455105012" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.641218 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=mariadb-operator-controller-manager-66f4dd4bc7-ffbs5_openstack-operators(8a55643f-68a5-47ea-8b27-db437d3af215)\"" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" podUID="8a55643f-68a5-47ea-8b27-db437d3af215" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.694246 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.694714 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.694765 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.694994 4651 scope.go:117] "RemoveContainer" containerID="c23ab9b50a1b00edef74924d29298407589e5e2bf5947109db3caa8896f8e645" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.695291 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=heat-operator-controller-manager-5b77f656f-pt9q8_openstack-operators(e5c0812c-3183-4f45-b6b9-d4975f8bb80a)\"" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" 
podUID="e5c0812c-3183-4f45-b6b9-d4975f8bb80a" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.695391 4651 scope.go:117] "RemoveContainer" containerID="ae67209f929776e077fd695c2b58f547efadc1914e7822bd684b5c862e5df403" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.695622 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=neutron-operator-controller-manager-6fdcddb789-8h624_openstack-operators(8271ec0d-f8ea-4c46-984f-95572691a379)\"" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" podUID="8271ec0d-f8ea-4c46-984f-95572691a379" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.744329 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.745067 4651 scope.go:117] "RemoveContainer" containerID="89a423091fbe525761394ca14b0216f9d5e770c147f667c25cd0f0e08bd4c69e" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.745289 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=nova-operator-controller-manager-79556f57fc-cnwcz_openstack-operators(e9981be4-751d-4c74-894a-698adad4c50f)\"" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" podUID="e9981be4-751d-4c74-894a-698adad4c50f" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.948480 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.948535 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" Nov 26 15:13:07 crc kubenswrapper[4651]: I1126 15:13:07.949229 4651 scope.go:117] "RemoveContainer" containerID="11cf52b297be608aadae5d2da899355aebb2aa49c0e147e8cf680857b1b0e84e" Nov 26 15:13:07 crc kubenswrapper[4651]: E1126 15:13:07.949565 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ovn-operator-controller-manager-56897c768d-k2rdd_openstack-operators(ce4c06a7-4bcb-4167-bec1-14a45ca24bea)\"" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" podUID="ce4c06a7-4bcb-4167-bec1-14a45ca24bea" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.029219 4651 generic.go:334] "Generic (PLEG): container finished" podID="14110a58-3dd5-4827-8a86-d4c0fc377b97" containerID="1a2f50227b1cbedb79671ac0548046a71e22693c3eb63f8c57463adf30404c58" exitCode=1 Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.029315 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" event={"ID":"14110a58-3dd5-4827-8a86-d4c0fc377b97","Type":"ContainerDied","Data":"1a2f50227b1cbedb79671ac0548046a71e22693c3eb63f8c57463adf30404c58"} Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.029362 4651 scope.go:117] "RemoveContainer" containerID="8094d95eeaf473de01c22d35983ec9c10b42c0c887ef3a53c8e159a476b692a1" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.030012 4651 scope.go:117] "RemoveContainer" containerID="1a2f50227b1cbedb79671ac0548046a71e22693c3eb63f8c57463adf30404c58" Nov 26 15:13:08 crc kubenswrapper[4651]: E1126 15:13:08.030349 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager 
pod=ironic-operator-controller-manager-67cb4dc6d4-cggjs_openstack-operators(14110a58-3dd5-4827-8a86-d4c0fc377b97)\"" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.031411 4651 generic.go:334] "Generic (PLEG): container finished" podID="8c2d03fc-6edd-4654-8116-99aae88e3fab" containerID="109070003e847d87244ff2be13ea3b959a6ad3c8915869fc214ce9a97b91f49a" exitCode=1 Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.031626 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8c2d03fc-6edd-4654-8116-99aae88e3fab","Type":"ContainerDied","Data":"109070003e847d87244ff2be13ea3b959a6ad3c8915869fc214ce9a97b91f49a"} Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.031796 4651 scope.go:117] "RemoveContainer" containerID="d945781061fd3e783cfa06cb5b7f62620dd928883b1e5cdc2160afd8576ba71d" Nov 26 15:13:08 crc kubenswrapper[4651]: E1126 15:13:08.032014 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=cinder-operator-controller-manager-6b7f75547b-k4tq9_openstack-operators(85fb4e98-47db-403d-85e3-c2550cd47160)\"" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" podUID="85fb4e98-47db-403d-85e3-c2550cd47160" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.032285 4651 scope.go:117] "RemoveContainer" containerID="109070003e847d87244ff2be13ea3b959a6ad3c8915869fc214ce9a97b91f49a" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.032385 4651 scope.go:117] "RemoveContainer" containerID="0c597ff292f2433a636abd52aefbd6d01f3c77267941ef1aee522c71d06f77d9" Nov 26 15:13:08 crc kubenswrapper[4651]: E1126 15:13:08.032548 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-state-metrics\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-state-metrics pod=kube-state-metrics-0_openstack(8c2d03fc-6edd-4654-8116-99aae88e3fab)\"" pod="openstack/kube-state-metrics-0" podUID="8c2d03fc-6edd-4654-8116-99aae88e3fab" Nov 26 15:13:08 crc kubenswrapper[4651]: E1126 15:13:08.032586 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=keystone-operator-controller-manager-7b4567c7cf-hmndm_openstack-operators(dc5a51cf-b992-4542-8b00-2948ab513eed)\"" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" podUID="dc5a51cf-b992-4542-8b00-2948ab513eed" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.078552 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.079684 4651 scope.go:117] "RemoveContainer" containerID="52461601bfcd5ab86a930fcaa0701b10b0c34bfe00b1af8757a116f097dfc44e" Nov 26 15:13:08 crc kubenswrapper[4651]: E1126 15:13:08.079954 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=placement-operator-controller-manager-57988cc5b5-269d2_openstack-operators(a8e49781-2e0b-476d-be9f-e17f05639447)\"" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" podUID="a8e49781-2e0b-476d-be9f-e17f05639447" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.094717 4651 scope.go:117] "RemoveContainer" containerID="607050b38072b6ce707df72bc43372a60626df65e9a49880f8f7d1d487708711" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.180254 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.180298 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.180619 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.180738 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.181016 4651 scope.go:117] "RemoveContainer" containerID="525c1ddb2a0b10ca086856e3c06fcf68d98b0ea9974fdb788725f75bb6ab1d01" Nov 26 15:13:08 crc kubenswrapper[4651]: E1126 15:13:08.181284 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=octavia-operator-controller-manager-64cdc6ff96-x9mdd_openstack-operators(b24122be-246e-4dc9-a3ad-4ca2392a4660)\"" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" podUID="b24122be-246e-4dc9-a3ad-4ca2392a4660" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.181384 4651 scope.go:117] "RemoveContainer" containerID="716508fb3ef35f2dbd7cbdb035a325a4470da4b25ea778ec04b6babf810901b9" Nov 26 15:13:08 crc kubenswrapper[4651]: E1126 15:13:08.181692 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=swift-operator-controller-manager-d77b94747-6kjgs_openstack-operators(719afb5d-40c4-4fa3-b030-38c170fc7dbb)\"" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" 
podUID="719afb5d-40c4-4fa3-b030-38c170fc7dbb" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.351493 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-l2z9w" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.365915 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.366887 4651 scope.go:117] "RemoveContainer" containerID="f0f93e86575df7ab4752e098b888c588e3c053096fe9e7d6b159a3cd5b591051" Nov 26 15:13:08 crc kubenswrapper[4651]: E1126 15:13:08.367292 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=telemetry-operator-controller-manager-76cc84c6bb-8zvlb_openstack-operators(66532d04-3411-4813-ae53-4d635ee98911)\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" podUID="66532d04-3411-4813-ae53-4d635ee98911" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.402484 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.402583 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" Nov 26 15:13:08 crc kubenswrapper[4651]: I1126 15:13:08.403276 4651 scope.go:117] "RemoveContainer" containerID="410c3068e477b7b02d24d9247fb5e9710d1c993afbbb8fee84a2f57cb6ec2a76" Nov 26 15:13:08 crc kubenswrapper[4651]: E1126 15:13:08.403554 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=manager pod=watcher-operator-controller-manager-656dcb59d4-s5dd9_openstack-operators(e8ad6eac-027c-4615-a5dd-6facdc1db056)\"" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" podUID="e8ad6eac-027c-4615-a5dd-6facdc1db056" Nov 26 15:13:09 crc kubenswrapper[4651]: I1126 15:13:09.045424 4651 scope.go:117] "RemoveContainer" containerID="410c3068e477b7b02d24d9247fb5e9710d1c993afbbb8fee84a2f57cb6ec2a76" Nov 26 15:13:09 crc kubenswrapper[4651]: E1126 15:13:09.045774 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=watcher-operator-controller-manager-656dcb59d4-s5dd9_openstack-operators(e8ad6eac-027c-4615-a5dd-6facdc1db056)\"" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" podUID="e8ad6eac-027c-4615-a5dd-6facdc1db056" Nov 26 15:13:09 crc kubenswrapper[4651]: I1126 15:13:09.046000 4651 scope.go:117] "RemoveContainer" containerID="716508fb3ef35f2dbd7cbdb035a325a4470da4b25ea778ec04b6babf810901b9" Nov 26 15:13:09 crc kubenswrapper[4651]: I1126 15:13:09.046298 4651 scope.go:117] "RemoveContainer" containerID="109070003e847d87244ff2be13ea3b959a6ad3c8915869fc214ce9a97b91f49a" Nov 26 15:13:09 crc kubenswrapper[4651]: E1126 15:13:09.046321 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=swift-operator-controller-manager-d77b94747-6kjgs_openstack-operators(719afb5d-40c4-4fa3-b030-38c170fc7dbb)\"" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" podUID="719afb5d-40c4-4fa3-b030-38c170fc7dbb" Nov 26 15:13:09 crc kubenswrapper[4651]: E1126 15:13:09.046597 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-state-metrics pod=kube-state-metrics-0_openstack(8c2d03fc-6edd-4654-8116-99aae88e3fab)\"" pod="openstack/kube-state-metrics-0" podUID="8c2d03fc-6edd-4654-8116-99aae88e3fab" Nov 26 15:13:09 crc kubenswrapper[4651]: I1126 15:13:09.763317 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:13:09 crc kubenswrapper[4651]: I1126 15:13:09.974716 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9" Nov 26 15:13:09 crc kubenswrapper[4651]: I1126 15:13:09.993443 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 26 15:13:10 crc kubenswrapper[4651]: I1126 15:13:10.402764 4651 scope.go:117] "RemoveContainer" containerID="2e1a7fb8fe4a747df0ec560d0f9d356b553d8e69b84e844d7ff5ed2b0bd0c6ea" Nov 26 15:13:10 crc kubenswrapper[4651]: I1126 15:13:10.406168 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 26 15:13:10 crc kubenswrapper[4651]: I1126 15:13:10.680703 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6b4f979c6c-lg95c" Nov 26 15:13:10 crc kubenswrapper[4651]: I1126 15:13:10.741763 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rdrk6" Nov 26 15:13:10 crc kubenswrapper[4651]: I1126 15:13:10.840525 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 26 15:13:10 crc kubenswrapper[4651]: I1126 15:13:10.943109 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 26 15:13:11 crc kubenswrapper[4651]: I1126 15:13:11.067807 4651 generic.go:334] "Generic 
(PLEG): container finished" podID="f688796e-89d5-4da8-8dc7-786c5940b853" containerID="24bdc57da7ffb181212009242952cc98ef650f978813c50763acdd275e861133" exitCode=1 Nov 26 15:13:11 crc kubenswrapper[4651]: I1126 15:13:11.067869 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" event={"ID":"f688796e-89d5-4da8-8dc7-786c5940b853","Type":"ContainerDied","Data":"24bdc57da7ffb181212009242952cc98ef650f978813c50763acdd275e861133"} Nov 26 15:13:11 crc kubenswrapper[4651]: I1126 15:13:11.067909 4651 scope.go:117] "RemoveContainer" containerID="2e1a7fb8fe4a747df0ec560d0f9d356b553d8e69b84e844d7ff5ed2b0bd0c6ea" Nov 26 15:13:11 crc kubenswrapper[4651]: I1126 15:13:11.068978 4651 scope.go:117] "RemoveContainer" containerID="24bdc57da7ffb181212009242952cc98ef650f978813c50763acdd275e861133" Nov 26 15:13:11 crc kubenswrapper[4651]: E1126 15:13:11.070859 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=metallb-operator-controller-manager-5b5d786cf6-wsrgh_metallb-system(f688796e-89d5-4da8-8dc7-786c5940b853)\"" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" Nov 26 15:13:11 crc kubenswrapper[4651]: I1126 15:13:11.396682 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 26 15:13:11 crc kubenswrapper[4651]: I1126 15:13:11.462012 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 26 15:13:11 crc kubenswrapper[4651]: I1126 15:13:11.475993 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 26 15:13:11 crc kubenswrapper[4651]: I1126 15:13:11.727129 4651 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 26 15:13:11 crc kubenswrapper[4651]: I1126 15:13:11.739219 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 26 15:13:11 crc kubenswrapper[4651]: I1126 15:13:11.901882 4651 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 26 15:13:11 crc kubenswrapper[4651]: I1126 15:13:11.978407 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.071168 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.160253 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.200433 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-rvhpg" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.217838 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.240835 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.310636 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.329537 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.383823 4651 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-oauth-apiserver"/"audit-1" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.570790 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.580575 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.618492 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.653712 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.660974 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.690646 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-gdbpc" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.712317 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.790841 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.823241 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.888142 4651 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.900576 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.900635 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.905204 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.924578 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=12.924558321 podStartE2EDuration="12.924558321s" podCreationTimestamp="2025-11-26 15:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:13:12.92454189 +0000 UTC m=+1360.350289514" watchObservedRunningTime="2025-11-26 15:13:12.924558321 +0000 UTC m=+1360.350305935" Nov 26 15:13:12 crc kubenswrapper[4651]: I1126 15:13:12.955699 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.051632 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.081684 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.133882 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-gsfrd" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.237097 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.256770 4651 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.352099 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.431122 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-f4vf8" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.521926 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.525502 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.654066 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.734397 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-gnqpg" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.785301 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.804725 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.886271 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.927096 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 26 
15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.970395 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 26 15:13:13 crc kubenswrapper[4651]: I1126 15:13:13.982833 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.158843 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.177653 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.275805 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.313645 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.314812 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.322308 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.327286 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.361865 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.363722 4651 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.368424 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-t7x8w" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.368573 4651 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jw7t8" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.373012 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.378111 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.395693 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.412816 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.456236 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.457903 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.484519 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.493072 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-q7r49" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 
15:13:14.536688 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.554446 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-p822z" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.558822 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.569324 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-db2f4" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.573475 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.580763 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-bgfxc" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.595877 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.624338 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.659293 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.664742 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6r9kk" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.727825 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 26 15:13:14 crc 
kubenswrapper[4651]: I1126 15:13:14.738552 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.748681 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.750127 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.764591 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.790437 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.792302 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.893997 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.953319 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.956224 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-g2k9k" Nov 26 15:13:14 crc kubenswrapper[4651]: I1126 15:13:14.970109 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.021086 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 26 
15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.053543 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.063403 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.079191 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.217742 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.251599 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.294784 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.336801 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.355347 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.355420 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/kube-state-metrics-0" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.356292 4651 scope.go:117] "RemoveContainer" containerID="109070003e847d87244ff2be13ea3b959a6ad3c8915869fc214ce9a97b91f49a" Nov 26 15:13:15 crc kubenswrapper[4651]: E1126 15:13:15.357361 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-state-metrics 
pod=kube-state-metrics-0_openstack(8c2d03fc-6edd-4654-8116-99aae88e3fab)\"" pod="openstack/kube-state-metrics-0" podUID="8c2d03fc-6edd-4654-8116-99aae88e3fab" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.364076 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.367492 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.430243 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.442086 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.453938 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.551127 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.591797 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.609400 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.617746 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.632108 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.660165 4651 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.681320 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.691589 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.778881 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.873652 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.921587 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.944900 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.960187 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 26 15:13:15 crc kubenswrapper[4651]: I1126 15:13:15.996132 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hn64j" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.013266 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.031960 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.037987 4651 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.086990 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-sk4zf" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.125772 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.161256 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.178216 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.242673 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.244500 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.267125 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-62rlj" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.301140 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.354019 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k89xc" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.362558 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.379120 4651 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.379972 4651 scope.go:117] "RemoveContainer" containerID="24bdc57da7ffb181212009242952cc98ef650f978813c50763acdd275e861133" Nov 26 15:13:16 crc kubenswrapper[4651]: E1126 15:13:16.380248 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=metallb-operator-controller-manager-5b5d786cf6-wsrgh_metallb-system(f688796e-89d5-4da8-8dc7-786c5940b853)\"" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" podUID="f688796e-89d5-4da8-8dc7-786c5940b853" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.401866 4651 scope.go:117] "RemoveContainer" containerID="8e6d638dd8622b109bdc989d4b082b8775a2ab52d359ae2030d59448fa73ddc4" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.405765 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.424591 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.453329 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.466754 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.476316 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.502189 4651 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.505396 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.529082 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sdtmc" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.540254 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.542399 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.593400 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.680682 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.723445 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.744985 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.763452 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-hsft9" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.794688 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.874907 4651 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.912434 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-hcd7m" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.925616 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 26 15:13:16 crc kubenswrapper[4651]: I1126 15:13:16.978639 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.040362 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.049940 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.084716 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.100114 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.111648 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.130792 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wwjsd" event={"ID":"a72e6d14-1571-4b70-b872-a4a4b0b3c242","Type":"ContainerStarted","Data":"be349ab0c7ce5ebc3675aa6f3c3228016aaddd3af22016b2184f7d65da9b4c22"} Nov 26 15:13:17 crc 
kubenswrapper[4651]: I1126 15:13:17.194970 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.217936 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.283237 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.318942 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.381356 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-bkc2d" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.405404 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.447283 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.462723 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.485135 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.485905 4651 scope.go:117] "RemoveContainer" containerID="1a2f50227b1cbedb79671ac0548046a71e22693c3eb63f8c57463adf30404c58" Nov 26 15:13:17 crc kubenswrapper[4651]: E1126 15:13:17.486182 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=ironic-operator-controller-manager-67cb4dc6d4-cggjs_openstack-operators(14110a58-3dd5-4827-8a86-d4c0fc377b97)\"" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.504819 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.526385 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.616381 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.619825 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.622970 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.624956 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.668777 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.701520 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.750424 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.803002 
4651 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-vnccs" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.824910 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-xtz7s" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.892640 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.908701 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.969555 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 26 15:13:17 crc kubenswrapper[4651]: I1126 15:13:17.984796 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.004414 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.049301 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.058616 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.099331 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.129965 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.158132 4651 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.196332 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.204904 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.210421 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.275610 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.286500 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.286530 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.290769 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.296759 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.380373 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.385584 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.386013 4651 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.402423 4651 scope.go:117] "RemoveContainer" containerID="c044390f166cbf04a3406f934d36dee401d396c7c23b4f2b5a5b08e920cfc86f" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.403239 4651 scope.go:117] "RemoveContainer" containerID="e5acf425e9f900ca037b72b89b3cc4f5a49f57d2d78af4fd208599e8eaf827b5" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.414393 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.443570 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.474806 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.499504 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.570828 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.577153 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.600577 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.604310 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.617944 4651 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.638398 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.643734 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.677304 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.689268 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.695437 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.724336 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.727876 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.734107 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.765923 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.774503 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 
15:13:18.793219 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.863973 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.865313 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.891849 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.956475 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.963506 4651 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.987426 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 26 15:13:18 crc kubenswrapper[4651]: I1126 15:13:18.996240 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.027691 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.042162 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.064727 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bwqr5" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 
15:13:19.082552 4651 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.167778 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" event={"ID":"ec10af15-dcf5-413d-87ef-0ca5a469b5fa","Type":"ContainerStarted","Data":"05e704f8008b51d00c2ddc4edade15e7fd153dae4fbfb53da64b895caa7f7dc2"} Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.168211 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.172243 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" event={"ID":"99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6","Type":"ContainerStarted","Data":"3d6ea600b48a165ad5972cc62e845c504f4ad6b632418d024fd5ae708e230a20"} Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.172549 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.175722 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.205681 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.254208 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.291177 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 26 15:13:19 
crc kubenswrapper[4651]: I1126 15:13:19.317882 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.372158 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.382301 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-jwdhh" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.388355 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.402344 4651 scope.go:117] "RemoveContainer" containerID="d945781061fd3e783cfa06cb5b7f62620dd928883b1e5cdc2160afd8576ba71d" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.402893 4651 scope.go:117] "RemoveContainer" containerID="11cf52b297be608aadae5d2da899355aebb2aa49c0e147e8cf680857b1b0e84e" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.403779 4651 scope.go:117] "RemoveContainer" containerID="3f4740a4ce4122a4d499010e74fdc53d29f2871babb37d597d7d0ee9f15a0c73" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.415312 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.445018 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.482947 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-pnk7g" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.502684 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 26 15:13:19 crc 
kubenswrapper[4651]: I1126 15:13:19.602727 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dkbnm" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.674969 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.710629 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.776178 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.814381 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.847782 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-gk4z8" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.875100 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.905244 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.908587 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.918812 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 26 15:13:19 crc kubenswrapper[4651]: I1126 15:13:19.920882 4651 reflector.go:368] Caches populated for *v1.Secret from 
object-"metallb-system"/"speaker-certs-secret" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.008062 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.025384 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.080003 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.106342 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.144147 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.171308 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.172079 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.180879 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.183314 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" event={"ID":"e50a607f-7a61-4a78-870a-297fa0daa977","Type":"ContainerStarted","Data":"3f8e997a24ba57d10779c6e731fe2c8e9ba6e8644f1932252751a61e7d3a4073"} Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.183585 4651 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.185931 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" event={"ID":"85fb4e98-47db-403d-85e3-c2550cd47160","Type":"ContainerStarted","Data":"9eea2724b8f6e9de32d8a83bf0b984bf2a6be48b2c12bc8401cc43b6d373ee84"} Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.186204 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.188274 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" event={"ID":"ce4c06a7-4bcb-4167-bec1-14a45ca24bea","Type":"ContainerStarted","Data":"f98a161b8c77a783c6563ff2c63b139bec0acb4a8e215ce33094071a81bafc65"} Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.188414 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.197323 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.206927 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.309350 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.334730 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 26 15:13:20 crc kubenswrapper[4651]: 
I1126 15:13:20.342356 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.366971 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-nwqlt" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.372894 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.402530 4651 scope.go:117] "RemoveContainer" containerID="142315b59159b52645d68efe400c216bfc93ba5a82018b216c5a37634dca29fc" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.402776 4651 scope.go:117] "RemoveContainer" containerID="0c597ff292f2433a636abd52aefbd6d01f3c77267941ef1aee522c71d06f77d9" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.402860 4651 scope.go:117] "RemoveContainer" containerID="3cedf3ccae7cba75bc1b6b2ea825193f2bebd174ad51fedeb50c8e2455105012" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.446870 4651 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.603938 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.626124 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.643401 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.647713 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 26 15:13:20 crc 
kubenswrapper[4651]: I1126 15:13:20.659541 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.675522 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-2d62w" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.685060 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.738907 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.741251 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.858805 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.901105 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.904984 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zdppk" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.921791 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-x92qn" Nov 26 15:13:20 crc kubenswrapper[4651]: I1126 15:13:20.936619 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.008337 4651 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.030994 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.034163 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-pt6md" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.202977 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" event={"ID":"6a660fe2-a185-4e56-98cb-b12cdd749964","Type":"ContainerStarted","Data":"dc49612fa65fc7db85629184862e3e7fd0c9d75ee1ee7d0be2ba93b3c1e38020"} Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.204254 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.208599 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.208844 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" event={"ID":"dc5a51cf-b992-4542-8b00-2948ab513eed","Type":"ContainerStarted","Data":"afda9190f44f1179ae5a452a09be143747fb89b6139ab5b34d084954a302392e"} Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.209104 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.216783 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5" 
event={"ID":"8a55643f-68a5-47ea-8b27-db437d3af215","Type":"ContainerStarted","Data":"39ff24d891ba57915a77f6624d3f50d8b123bd8fdc15a37b7e0bf61bba50d1c8"} Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.279547 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.350544 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.368441 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.402330 4651 scope.go:117] "RemoveContainer" containerID="254deb1fc7d72a60a6d21c4d94d1447e8296be0ed3618004bc03d15c04f3f52e" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.402409 4651 scope.go:117] "RemoveContainer" containerID="c23ab9b50a1b00edef74924d29298407589e5e2bf5947109db3caa8896f8e645" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.402989 4651 scope.go:117] "RemoveContainer" containerID="cce1cfc1869d9794a36a4e425555a8e364f6f3dc1b078a62e6480b16a2ca5ec2" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.426700 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.427131 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.434094 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2hmsq" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.453783 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.479836 4651 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.493276 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.584495 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.595520 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-n9s69" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.625059 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.628470 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.634176 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.646284 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.649016 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.679969 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.691568 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 
15:13:21.716084 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.744305 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rdhrz" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.879201 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.916027 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.938610 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.947320 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.950308 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.967751 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 26 15:13:21 crc kubenswrapper[4651]: I1126 15:13:21.982277 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.082030 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.135598 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.139934 
4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.229118 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" event={"ID":"eed373f0-add9-4ae8-b5cc-ed711e79b5c5","Type":"ContainerStarted","Data":"fa6394ac1250aa7506556d1537dd9f45edb3adee69344aab9bc6ab64bbade881"} Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.229756 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.232030 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" event={"ID":"5f58ef49-d516-48e5-a508-e4102374d111","Type":"ContainerStarted","Data":"24c2a179836eb36db9b32518e4ce916d93aee4da183e7c9233c91e9126b757c1"} Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.233067 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.235603 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8" event={"ID":"e5c0812c-3183-4f45-b6b9-d4975f8bb80a","Type":"ContainerStarted","Data":"fa874c6af6a73a4bb1f7f6cf99bd488c2ac5dd83f82a227fb02d76c20c6e5fb2"} Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.237719 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.300242 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 
15:13:22.336228 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.402740 4651 scope.go:117] "RemoveContainer" containerID="5f80678806d74e58b9cc4f523aaaf9b90d456ca8ee88ffdcd603f132a11bbd53" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.403355 4651 scope.go:117] "RemoveContainer" containerID="52461601bfcd5ab86a930fcaa0701b10b0c34bfe00b1af8757a116f097dfc44e" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.403433 4651 scope.go:117] "RemoveContainer" containerID="ae67209f929776e077fd695c2b58f547efadc1914e7822bd684b5c862e5df403" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.404535 4651 scope.go:117] "RemoveContainer" containerID="525c1ddb2a0b10ca086856e3c06fcf68d98b0ea9974fdb788725f75bb6ab1d01" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.406109 4651 scope.go:117] "RemoveContainer" containerID="410c3068e477b7b02d24d9247fb5e9710d1c993afbbb8fee84a2f57cb6ec2a76" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.499259 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.559374 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.602561 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-qqpd8" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.609566 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.611901 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.683626 4651 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.690506 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.693666 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.717242 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.757978 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.783669 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.830444 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.842398 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-d4kwd" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.859654 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.869614 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-2sdkq" Nov 26 15:13:22 crc kubenswrapper[4651]: I1126 15:13:22.917721 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 26 15:13:22 crc kubenswrapper[4651]: 
I1126 15:13:22.985261 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.010686 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.034136 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.044435 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.047832 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.054946 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.057747 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-shslt" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.081430 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.113171 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pv895" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.117439 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.150325 4651 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-scripts" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.152245 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.169250 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-r9nfj" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.169455 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.172245 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.182317 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.222167 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.247060 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.248705 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" event={"ID":"e8ad6eac-027c-4615-a5dd-6facdc1db056","Type":"ContainerStarted","Data":"5fafaf04a740b61798a9b277ae117dc17063f076244216afe915ff4dcd1763f5"} Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.248981 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.251880 4651 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" event={"ID":"8271ec0d-f8ea-4c46-984f-95572691a379","Type":"ContainerStarted","Data":"c89c90fc067f5979f3349b409536d42eafb5f10e7b3b1f457b8ad24dfaa934f7"} Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.252413 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.253698 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" event={"ID":"b24122be-246e-4dc9-a3ad-4ca2392a4660","Type":"ContainerStarted","Data":"f4ac37fe71f4391eb4070a92aab85da566d4f1fb7f47ba8d3d66c707412fa153"} Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.255263 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.265841 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" event={"ID":"53400076-0e4e-4e0b-b476-d4a1fd901631","Type":"ContainerStarted","Data":"8eb292ebd3d9ae881cfd7bf52a197514b32198d741ccaae4e5dda6400364588f"} Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.266264 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.269235 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2" event={"ID":"a8e49781-2e0b-476d-be9f-e17f05639447","Type":"ContainerStarted","Data":"aca585167dc204c26bd62b0eecaf6632e31739e018d5363e03ae20ac571930e4"} Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.296015 
4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.344054 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.352510 4651 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.352744 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9" gracePeriod=5 Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.401799 4651 scope.go:117] "RemoveContainer" containerID="f0f93e86575df7ab4752e098b888c588e3c053096fe9e7d6b159a3cd5b591051" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.402984 4651 scope.go:117] "RemoveContainer" containerID="716508fb3ef35f2dbd7cbdb035a325a4470da4b25ea778ec04b6babf810901b9" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.414094 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.421549 4651 scope.go:117] "RemoveContainer" containerID="89a423091fbe525761394ca14b0216f9d5e770c147f667c25cd0f0e08bd4c69e" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.447405 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.615371 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-8555n" Nov 26 15:13:23 crc 
kubenswrapper[4651]: I1126 15:13:23.634710 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-cm4sk" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.684993 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-t2vhh" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.690999 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.722595 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.757574 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.765696 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-s27gg" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.799216 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.825153 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.909000 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.924351 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-bh8br" Nov 26 15:13:23 crc kubenswrapper[4651]: I1126 15:13:23.982794 4651 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.004175 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.006266 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.025853 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5bcdd9fbc-vsb4g" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.087151 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.127617 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.138879 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.144461 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.202003 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4tjkf" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.202232 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.218861 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 26 15:13:24 crc 
kubenswrapper[4651]: I1126 15:13:24.249395 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.288799 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" event={"ID":"e9981be4-751d-4c74-894a-698adad4c50f","Type":"ContainerStarted","Data":"7e427563e3b5821f01da97bb93211be550b30eb9450064cb5d866c7317e9f301"} Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.290111 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.294158 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" event={"ID":"66532d04-3411-4813-ae53-4d635ee98911","Type":"ContainerStarted","Data":"5b0923472711ce3d8d98dd93ee13dead91597414b8cedbe287e053e78e724562"} Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.294475 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.296327 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs" event={"ID":"719afb5d-40c4-4fa3-b030-38c170fc7dbb","Type":"ContainerStarted","Data":"458f7d97fedf7530435ed345fff4b67bf5262e56ff4512981136d00feaccb599"} Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.362007 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.374238 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" 
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.449866 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.472025 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.538010 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-nj5mh"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.567528 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.588686 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.588721 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.623687 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.632290 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.635710 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.737772 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.843543 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.882238 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.954132 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.954560 4651 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-9p5j9"
Nov 26 15:13:24 crc kubenswrapper[4651]: I1126 15:13:24.959090 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.103699 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.106427 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-jdjct"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.160202 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mdhqs"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.419470 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.461966 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.466412 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.636961 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.733248 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.787309 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.933209 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-89fgx"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.966091 4651 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Nov 26 15:13:25 crc kubenswrapper[4651]: I1126 15:13:25.990291 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wv4vg"
Nov 26 15:13:26 crc kubenswrapper[4651]: I1126 15:13:26.236566 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Nov 26 15:13:26 crc kubenswrapper[4651]: I1126 15:13:26.515076 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Nov 26 15:13:26 crc kubenswrapper[4651]: I1126 15:13:26.521892 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Nov 26 15:13:26 crc kubenswrapper[4651]: I1126 15:13:26.523415 4651 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Nov 26 15:13:26 crc kubenswrapper[4651]: I1126 15:13:26.658986 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Nov 26 15:13:26 crc kubenswrapper[4651]: I1126 15:13:26.890152 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.215459 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-k4tq9"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.229427 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-5jb5x"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.328622 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-955677c94-q8cjf"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.336559 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-gqj7p"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.444665 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-v89cv"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.478106 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-hmndm"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.484936 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.485491 4651 scope.go:117] "RemoveContainer" containerID="1a2f50227b1cbedb79671ac0548046a71e22693c3eb63f8c57463adf30404c58"
Nov 26 15:13:27 crc kubenswrapper[4651]: E1126 15:13:27.485752 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=ironic-operator-controller-manager-67cb4dc6d4-cggjs_openstack-operators(14110a58-3dd5-4827-8a86-d4c0fc377b97)\"" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" podUID="14110a58-3dd5-4827-8a86-d4c0fc377b97"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.557290 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-tszf4"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.640359 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.642637 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-ffbs5"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.694672 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.698567 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-pt9q8"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.700068 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-8h624"
Nov 26 15:13:27 crc kubenswrapper[4651]: I1126 15:13:27.951908 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-k2rdd"
Nov 26 15:13:28 crc kubenswrapper[4651]: I1126 15:13:28.079301 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2"
Nov 26 15:13:28 crc kubenswrapper[4651]: I1126 15:13:28.081858 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-269d2"
Nov 26 15:13:28 crc kubenswrapper[4651]: I1126 15:13:28.180963 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs"
Nov 26 15:13:28 crc kubenswrapper[4651]: I1126 15:13:28.184302 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-d77b94747-6kjgs"
Nov 26 15:13:28 crc kubenswrapper[4651]: I1126 15:13:28.186142 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-x9mdd"
Nov 26 15:13:28 crc kubenswrapper[4651]: I1126 15:13:28.379522 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8zvlb"
Nov 26 15:13:28 crc kubenswrapper[4651]: I1126 15:13:28.401733 4651 scope.go:117] "RemoveContainer" containerID="24bdc57da7ffb181212009242952cc98ef650f978813c50763acdd275e861133"
Nov 26 15:13:28 crc kubenswrapper[4651]: E1126 15:13:28.402171 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=metallb-operator-controller-manager-5b5d786cf6-wsrgh_metallb-system(f688796e-89d5-4da8-8dc7-786c5940b853)\"" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" podUID="f688796e-89d5-4da8-8dc7-786c5940b853"
Nov 26 15:13:28 crc kubenswrapper[4651]: I1126 15:13:28.402760 4651 scope.go:117] "RemoveContainer" containerID="109070003e847d87244ff2be13ea3b959a6ad3c8915869fc214ce9a97b91f49a"
Nov 26 15:13:28 crc kubenswrapper[4651]: I1126 15:13:28.408873 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-s5dd9"
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.028354 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.028644 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.143097 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.143152 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.143258 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.143318 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.143393 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.143694 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.143729 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.143776 4651 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.143789 4651 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.143842 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.143864 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.159443 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.245214 4651 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.245422 4651 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.245434 4651 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.346759 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.346818 4651 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9" exitCode=137
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.346902 4651 scope.go:117] "RemoveContainer" containerID="c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9"
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.347071 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.352172 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8c2d03fc-6edd-4654-8116-99aae88e3fab","Type":"ContainerStarted","Data":"1076b8f0f5a5583973f7e7706f67f70772b90c26fa5c9695fc7497b1916b565c"}
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.353855 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.391671 4651 scope.go:117] "RemoveContainer" containerID="c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9"
Nov 26 15:13:29 crc kubenswrapper[4651]: E1126 15:13:29.393301 4651 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9\": container with ID starting with c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9 not found: ID does not exist" containerID="c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9"
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.393347 4651 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9"} err="failed to get container status \"c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9\": rpc error: code = NotFound desc = could not find container \"c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9\": container with ID starting with c64f8d8d8d28e8e119815ae3bae3c99f01ff36ac73ef8915c81013846f3d41f9 not found: ID does not exist"
Nov 26 15:13:29 crc kubenswrapper[4651]: I1126 15:13:29.428121 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Nov 26 15:13:35 crc kubenswrapper[4651]: I1126 15:13:35.365919 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Nov 26 15:13:37 crc kubenswrapper[4651]: I1126 15:13:37.494856 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Nov 26 15:13:37 crc kubenswrapper[4651]: I1126 15:13:37.747839 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-cnwcz"
Nov 26 15:13:37 crc kubenswrapper[4651]: I1126 15:13:37.793543 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Nov 26 15:13:39 crc kubenswrapper[4651]: I1126 15:13:39.402836 4651 scope.go:117] "RemoveContainer" containerID="24bdc57da7ffb181212009242952cc98ef650f978813c50763acdd275e861133"
Nov 26 15:13:40 crc kubenswrapper[4651]: I1126 15:13:40.027171 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Nov 26 15:13:40 crc kubenswrapper[4651]: I1126 15:13:40.401911 4651 scope.go:117] "RemoveContainer" containerID="1a2f50227b1cbedb79671ac0548046a71e22693c3eb63f8c57463adf30404c58"
Nov 26 15:13:40 crc kubenswrapper[4651]: I1126 15:13:40.472227 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" event={"ID":"f688796e-89d5-4da8-8dc7-786c5940b853","Type":"ContainerStarted","Data":"9cca9adb33dfcc0cca15bc72ae562b48e4277fd9b47d0a798ca72e0e058596bf"}
Nov 26 15:13:40 crc kubenswrapper[4651]: I1126 15:13:40.472430 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh"
Nov 26 15:13:41 crc kubenswrapper[4651]: I1126 15:13:41.182652 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-lb8mv"
Nov 26 15:13:41 crc kubenswrapper[4651]: I1126 15:13:41.482592 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs" event={"ID":"14110a58-3dd5-4827-8a86-d4c0fc377b97","Type":"ContainerStarted","Data":"1152526a8c34a6e8590bd97c259d67b894169eb79201de0a92996feac573d69b"}
Nov 26 15:13:41 crc kubenswrapper[4651]: I1126 15:13:41.483151 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs"
Nov 26 15:13:41 crc kubenswrapper[4651]: I1126 15:13:41.574669 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Nov 26 15:13:42 crc kubenswrapper[4651]: I1126 15:13:42.450138 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Nov 26 15:13:42 crc kubenswrapper[4651]: I1126 15:13:42.597419 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Nov 26 15:13:43 crc kubenswrapper[4651]: I1126 15:13:43.086848 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-msrsw"
Nov 26 15:13:43 crc kubenswrapper[4651]: I1126 15:13:43.182211 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Nov 26 15:13:44 crc kubenswrapper[4651]: I1126 15:13:44.006145 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Nov 26 15:13:45 crc kubenswrapper[4651]: I1126 15:13:45.441677 4651 scope.go:117] "RemoveContainer" containerID="1edbe76f7461d0e15bb50e3d25c46101540fe7fa380949172e2d3e161d96af11"
Nov 26 15:13:45 crc kubenswrapper[4651]: I1126 15:13:45.462730 4651 scope.go:117] "RemoveContainer" containerID="fd6292db0e0169b5a21270062d4dc5b40619437121e3ab781cd36981af2b0a5e"
Nov 26 15:13:45 crc kubenswrapper[4651]: I1126 15:13:45.498305 4651 scope.go:117] "RemoveContainer" containerID="b18db86a93db98c43ae81c9e05be8dec3a64dcb4461b4f2f64d198b8779af2c0"
Nov 26 15:13:45 crc kubenswrapper[4651]: I1126 15:13:45.547436 4651 scope.go:117] "RemoveContainer" containerID="9ec0d32784ed5ced6da55fc86d918723f8ec5b5e23f395de2b4b6ddd05c4482a"
Nov 26 15:13:47 crc kubenswrapper[4651]: I1126 15:13:47.487528 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-cggjs"
Nov 26 15:13:47 crc kubenswrapper[4651]: I1126 15:13:47.670840 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Nov 26 15:13:48 crc kubenswrapper[4651]: I1126 15:13:48.882245 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Nov 26 15:13:52 crc kubenswrapper[4651]: I1126 15:13:52.331672 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Nov 26 15:13:52 crc kubenswrapper[4651]: I1126 15:13:52.628210 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Nov 26 15:13:54 crc kubenswrapper[4651]: I1126 15:13:54.137148 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Nov 26 15:13:54 crc kubenswrapper[4651]: I1126 15:13:54.762707 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Nov 26 15:13:54 crc kubenswrapper[4651]: I1126 15:13:54.917912 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kdvq7/must-gather-p8cln"]
Nov 26 15:13:54 crc kubenswrapper[4651]: E1126 15:13:54.918988 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 26 15:13:54 crc kubenswrapper[4651]: I1126 15:13:54.919091 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 26 15:13:54 crc kubenswrapper[4651]: E1126 15:13:54.919165 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" containerName="installer"
Nov 26 15:13:54 crc kubenswrapper[4651]: I1126 15:13:54.919229 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" containerName="installer"
Nov 26 15:13:54 crc kubenswrapper[4651]: I1126 15:13:54.919580 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4bbb04d-cfcf-46bd-9f9a-2700a64b37c2" containerName="installer"
Nov 26 15:13:54 crc kubenswrapper[4651]: I1126 15:13:54.919678 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 26 15:13:54 crc kubenswrapper[4651]: I1126 15:13:54.920753 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdvq7/must-gather-p8cln"
Nov 26 15:13:54 crc kubenswrapper[4651]: I1126 15:13:54.930059 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kdvq7"/"openshift-service-ca.crt"
Nov 26 15:13:54 crc kubenswrapper[4651]: I1126 15:13:54.930435 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kdvq7"/"kube-root-ca.crt"
Nov 26 15:13:54 crc kubenswrapper[4651]: I1126 15:13:54.972567 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kdvq7/must-gather-p8cln"]
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.062593 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c87e7492-30cc-4750-b99a-5ef41775cb9b-must-gather-output\") pod \"must-gather-p8cln\" (UID: \"c87e7492-30cc-4750-b99a-5ef41775cb9b\") " pod="openshift-must-gather-kdvq7/must-gather-p8cln"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.062818 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5lrp\" (UniqueName: \"kubernetes.io/projected/c87e7492-30cc-4750-b99a-5ef41775cb9b-kube-api-access-s5lrp\") pod \"must-gather-p8cln\" (UID: \"c87e7492-30cc-4750-b99a-5ef41775cb9b\") " pod="openshift-must-gather-kdvq7/must-gather-p8cln"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.139506 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rjjbw"]
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.145323 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rjjbw"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.164268 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5lrp\" (UniqueName: \"kubernetes.io/projected/c87e7492-30cc-4750-b99a-5ef41775cb9b-kube-api-access-s5lrp\") pod \"must-gather-p8cln\" (UID: \"c87e7492-30cc-4750-b99a-5ef41775cb9b\") " pod="openshift-must-gather-kdvq7/must-gather-p8cln"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.164324 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c87e7492-30cc-4750-b99a-5ef41775cb9b-must-gather-output\") pod \"must-gather-p8cln\" (UID: \"c87e7492-30cc-4750-b99a-5ef41775cb9b\") " pod="openshift-must-gather-kdvq7/must-gather-p8cln"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.164745 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c87e7492-30cc-4750-b99a-5ef41775cb9b-must-gather-output\") pod \"must-gather-p8cln\" (UID: \"c87e7492-30cc-4750-b99a-5ef41775cb9b\") " pod="openshift-must-gather-kdvq7/must-gather-p8cln"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.166210 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rjjbw"]
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.247430 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5lrp\" (UniqueName: \"kubernetes.io/projected/c87e7492-30cc-4750-b99a-5ef41775cb9b-kube-api-access-s5lrp\") pod \"must-gather-p8cln\" (UID: \"c87e7492-30cc-4750-b99a-5ef41775cb9b\") " pod="openshift-must-gather-kdvq7/must-gather-p8cln"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.265439 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-utilities\") pod \"community-operators-rjjbw\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " pod="openshift-marketplace/community-operators-rjjbw"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.265533 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf4d4\" (UniqueName: \"kubernetes.io/projected/8e09c768-ab48-443b-92fd-e14e83ad5d9e-kube-api-access-lf4d4\") pod \"community-operators-rjjbw\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " pod="openshift-marketplace/community-operators-rjjbw"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.265605 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-catalog-content\") pod \"community-operators-rjjbw\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " pod="openshift-marketplace/community-operators-rjjbw"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.308814 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t8jb5"]
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.310654 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8jb5"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.355690 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8jb5"]
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.367277 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-catalog-content\") pod \"community-operators-rjjbw\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " pod="openshift-marketplace/community-operators-rjjbw"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.367353 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-utilities\") pod \"community-operators-rjjbw\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " pod="openshift-marketplace/community-operators-rjjbw"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.367424 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf4d4\" (UniqueName: \"kubernetes.io/projected/8e09c768-ab48-443b-92fd-e14e83ad5d9e-kube-api-access-lf4d4\") pod \"community-operators-rjjbw\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " pod="openshift-marketplace/community-operators-rjjbw"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.368516 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-catalog-content\") pod \"community-operators-rjjbw\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " pod="openshift-marketplace/community-operators-rjjbw"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.368726 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-utilities\") pod \"community-operators-rjjbw\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " pod="openshift-marketplace/community-operators-rjjbw"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.424128 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf4d4\" (UniqueName: \"kubernetes.io/projected/8e09c768-ab48-443b-92fd-e14e83ad5d9e-kube-api-access-lf4d4\") pod \"community-operators-rjjbw\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " pod="openshift-marketplace/community-operators-rjjbw"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.467576 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rjjbw"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.469818 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-catalog-content\") pod \"certified-operators-t8jb5\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " pod="openshift-marketplace/certified-operators-t8jb5"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.469945 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-utilities\") pod \"certified-operators-t8jb5\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " pod="openshift-marketplace/certified-operators-t8jb5"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.470050 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frb6v\" (UniqueName: \"kubernetes.io/projected/4185f823-c76c-46a8-86bb-13901074765f-kube-api-access-frb6v\") pod \"certified-operators-t8jb5\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " pod="openshift-marketplace/certified-operators-t8jb5"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.536740 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdvq7/must-gather-p8cln"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.571472 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-utilities\") pod \"certified-operators-t8jb5\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " pod="openshift-marketplace/certified-operators-t8jb5"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.571890 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frb6v\" (UniqueName: \"kubernetes.io/projected/4185f823-c76c-46a8-86bb-13901074765f-kube-api-access-frb6v\") pod \"certified-operators-t8jb5\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " pod="openshift-marketplace/certified-operators-t8jb5"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.572072 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-catalog-content\") pod \"certified-operators-t8jb5\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " pod="openshift-marketplace/certified-operators-t8jb5"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.572587 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-catalog-content\") pod \"certified-operators-t8jb5\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " pod="openshift-marketplace/certified-operators-t8jb5"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.574101 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-utilities\") pod \"certified-operators-t8jb5\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " pod="openshift-marketplace/certified-operators-t8jb5"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.605256 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frb6v\" (UniqueName: \"kubernetes.io/projected/4185f823-c76c-46a8-86bb-13901074765f-kube-api-access-frb6v\") pod \"certified-operators-t8jb5\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " pod="openshift-marketplace/certified-operators-t8jb5"
Nov 26 15:13:55 crc kubenswrapper[4651]: I1126 15:13:55.629159 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8jb5"
Nov 26 15:13:56 crc kubenswrapper[4651]: I1126 15:13:56.397894 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rjjbw"]
Nov 26 15:13:56 crc kubenswrapper[4651]: W1126 15:13:56.485506 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e09c768_ab48_443b_92fd_e14e83ad5d9e.slice/crio-249c04bf505cbc0003826f5faccd13688e8e8ebfaec9d73dd89e1963bd110c62 WatchSource:0}: Error finding container 249c04bf505cbc0003826f5faccd13688e8e8ebfaec9d73dd89e1963bd110c62: Status 404 returned error can't find the container with id 249c04bf505cbc0003826f5faccd13688e8e8ebfaec9d73dd89e1963bd110c62
Nov 26 15:13:56 crc kubenswrapper[4651]: I1126 15:13:56.666484 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Nov 26 15:13:56 crc kubenswrapper[4651]: I1126 15:13:56.683007 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rjjbw"
event={"ID":"8e09c768-ab48-443b-92fd-e14e83ad5d9e","Type":"ContainerStarted","Data":"249c04bf505cbc0003826f5faccd13688e8e8ebfaec9d73dd89e1963bd110c62"} Nov 26 15:13:56 crc kubenswrapper[4651]: I1126 15:13:56.727606 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8jb5"] Nov 26 15:13:56 crc kubenswrapper[4651]: W1126 15:13:56.769666 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4185f823_c76c_46a8_86bb_13901074765f.slice/crio-4e2fd861c05dc829291c98b5d8b1bcb60b0529b2a9143946350ab64d27d0fedb WatchSource:0}: Error finding container 4e2fd861c05dc829291c98b5d8b1bcb60b0529b2a9143946350ab64d27d0fedb: Status 404 returned error can't find the container with id 4e2fd861c05dc829291c98b5d8b1bcb60b0529b2a9143946350ab64d27d0fedb Nov 26 15:13:56 crc kubenswrapper[4651]: I1126 15:13:56.877527 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kdvq7/must-gather-p8cln"] Nov 26 15:13:56 crc kubenswrapper[4651]: I1126 15:13:56.982114 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.752007 4651 generic.go:334] "Generic (PLEG): container finished" podID="4185f823-c76c-46a8-86bb-13901074765f" containerID="8e80dfb5765d248afe674a27c2d4176f44b0550b1040d1fe485099d3469b0483" exitCode=0 Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.753705 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8jb5" event={"ID":"4185f823-c76c-46a8-86bb-13901074765f","Type":"ContainerDied","Data":"8e80dfb5765d248afe674a27c2d4176f44b0550b1040d1fe485099d3469b0483"} Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.753755 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8jb5" 
event={"ID":"4185f823-c76c-46a8-86bb-13901074765f","Type":"ContainerStarted","Data":"4e2fd861c05dc829291c98b5d8b1bcb60b0529b2a9143946350ab64d27d0fedb"} Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.790102 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdvq7/must-gather-p8cln" event={"ID":"c87e7492-30cc-4750-b99a-5ef41775cb9b","Type":"ContainerStarted","Data":"cb910f6e55060aec9edc01e87d8a3a82d3cd75870aeb5104d39297674a9916a9"} Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.800866 4651 generic.go:334] "Generic (PLEG): container finished" podID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerID="92a11f6dcc33ec746a919a48c24118c4f97b2ce3c2dc26142bbbf73c0e4336ce" exitCode=0 Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.800906 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rjjbw" event={"ID":"8e09c768-ab48-443b-92fd-e14e83ad5d9e","Type":"ContainerDied","Data":"92a11f6dcc33ec746a919a48c24118c4f97b2ce3c2dc26142bbbf73c0e4336ce"} Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.909536 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ssqn4"] Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.912337 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.926886 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-utilities\") pod \"redhat-marketplace-ssqn4\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.927399 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-catalog-content\") pod \"redhat-marketplace-ssqn4\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.927587 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljwb2\" (UniqueName: \"kubernetes.io/projected/53ff2ff6-77e0-46c9-b558-b2a749b4704d-kube-api-access-ljwb2\") pod \"redhat-marketplace-ssqn4\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:13:57 crc kubenswrapper[4651]: I1126 15:13:57.937706 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssqn4"] Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.029182 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwb2\" (UniqueName: \"kubernetes.io/projected/53ff2ff6-77e0-46c9-b558-b2a749b4704d-kube-api-access-ljwb2\") pod \"redhat-marketplace-ssqn4\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.029294 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-utilities\") pod \"redhat-marketplace-ssqn4\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.029378 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-catalog-content\") pod \"redhat-marketplace-ssqn4\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.029851 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-catalog-content\") pod \"redhat-marketplace-ssqn4\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.030538 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-utilities\") pod \"redhat-marketplace-ssqn4\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.062169 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljwb2\" (UniqueName: \"kubernetes.io/projected/53ff2ff6-77e0-46c9-b558-b2a749b4704d-kube-api-access-ljwb2\") pod \"redhat-marketplace-ssqn4\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.110563 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-cw96g"] Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.110789 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cw96g" podUID="5c231a23-6a60-4022-9e05-66aee576b01a" containerName="registry-server" containerID="cri-o://b516d291a20013b8f1eee0a170ed9a7a9a403adee5be808c3cae2495abf3a320" gracePeriod=2 Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.290767 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.866495 4651 generic.go:334] "Generic (PLEG): container finished" podID="5c231a23-6a60-4022-9e05-66aee576b01a" containerID="b516d291a20013b8f1eee0a170ed9a7a9a403adee5be808c3cae2495abf3a320" exitCode=0 Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.866700 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw96g" event={"ID":"5c231a23-6a60-4022-9e05-66aee576b01a","Type":"ContainerDied","Data":"b516d291a20013b8f1eee0a170ed9a7a9a403adee5be808c3cae2495abf3a320"} Nov 26 15:13:58 crc kubenswrapper[4651]: I1126 15:13:58.969107 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.160189 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.278114 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-catalog-content\") pod \"5c231a23-6a60-4022-9e05-66aee576b01a\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.278262 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvrvq\" (UniqueName: \"kubernetes.io/projected/5c231a23-6a60-4022-9e05-66aee576b01a-kube-api-access-gvrvq\") pod \"5c231a23-6a60-4022-9e05-66aee576b01a\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.278452 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-utilities\") pod \"5c231a23-6a60-4022-9e05-66aee576b01a\" (UID: \"5c231a23-6a60-4022-9e05-66aee576b01a\") " Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.279760 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-utilities" (OuterVolumeSpecName: "utilities") pod "5c231a23-6a60-4022-9e05-66aee576b01a" (UID: "5c231a23-6a60-4022-9e05-66aee576b01a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.287645 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c231a23-6a60-4022-9e05-66aee576b01a-kube-api-access-gvrvq" (OuterVolumeSpecName: "kube-api-access-gvrvq") pod "5c231a23-6a60-4022-9e05-66aee576b01a" (UID: "5c231a23-6a60-4022-9e05-66aee576b01a"). InnerVolumeSpecName "kube-api-access-gvrvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.342927 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssqn4"] Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.381854 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.381900 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvrvq\" (UniqueName: \"kubernetes.io/projected/5c231a23-6a60-4022-9e05-66aee576b01a-kube-api-access-gvrvq\") on node \"crc\" DevicePath \"\"" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.469928 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c231a23-6a60-4022-9e05-66aee576b01a" (UID: "5c231a23-6a60-4022-9e05-66aee576b01a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.484387 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c231a23-6a60-4022-9e05-66aee576b01a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.771170 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.903454 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rjjbw" event={"ID":"8e09c768-ab48-443b-92fd-e14e83ad5d9e","Type":"ContainerStarted","Data":"68ddb1bc4b4fbde710b792abee4525242136bdbfe20979fbdd5571cfd96cfa28"} Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.915792 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8jb5" event={"ID":"4185f823-c76c-46a8-86bb-13901074765f","Type":"ContainerStarted","Data":"68edc6cd1f1b1fbccf055e2d70957478b0298ad207f645974aa9c1c49561dfe0"} Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.919724 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.922021 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw96g" event={"ID":"5c231a23-6a60-4022-9e05-66aee576b01a","Type":"ContainerDied","Data":"c7ecfe5a2d40d1df35b36b2164ec827cbc4c64cbb3a0e7ed45c6e8f51ea73711"} Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.922077 4651 scope.go:117] "RemoveContainer" containerID="b516d291a20013b8f1eee0a170ed9a7a9a403adee5be808c3cae2495abf3a320" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.922195 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cw96g" Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.930154 4651 generic.go:334] "Generic (PLEG): container finished" podID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerID="17c764b08c699b8817c803a1b5058962037489bd67fb8903a8a60dba0ca65713" exitCode=0 Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.930205 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssqn4" event={"ID":"53ff2ff6-77e0-46c9-b558-b2a749b4704d","Type":"ContainerDied","Data":"17c764b08c699b8817c803a1b5058962037489bd67fb8903a8a60dba0ca65713"} Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.930234 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssqn4" event={"ID":"53ff2ff6-77e0-46c9-b558-b2a749b4704d","Type":"ContainerStarted","Data":"34bfe1804040e60e0448b08754fc4bb3263d57e51e87fd9fa7883974eb48bef2"} Nov 26 15:13:59 crc kubenswrapper[4651]: I1126 15:13:59.997249 4651 scope.go:117] "RemoveContainer" containerID="04f06a03bca7ddabaec37db96bfbae8632a7e17ec860e3c4aa07c713e7b9c28e" Nov 26 15:14:00 crc kubenswrapper[4651]: E1126 15:14:00.023069 4651 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53ff2ff6_77e0_46c9_b558_b2a749b4704d.slice/crio-conmon-17c764b08c699b8817c803a1b5058962037489bd67fb8903a8a60dba0ca65713.scope\": RecentStats: unable to find data in memory cache]" Nov 26 15:14:00 crc kubenswrapper[4651]: I1126 15:14:00.121239 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cw96g"] Nov 26 15:14:00 crc kubenswrapper[4651]: I1126 15:14:00.143023 4651 scope.go:117] "RemoveContainer" containerID="5c1160bddb2e68af62f7943a9ee90d09eaafcb50ea36a6bcba5b6844e8d27c70" Nov 26 15:14:00 crc kubenswrapper[4651]: I1126 15:14:00.172632 4651 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cw96g"] Nov 26 15:14:00 crc kubenswrapper[4651]: I1126 15:14:00.269291 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 26 15:14:00 crc kubenswrapper[4651]: I1126 15:14:00.457316 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 26 15:14:01 crc kubenswrapper[4651]: I1126 15:14:01.423639 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c231a23-6a60-4022-9e05-66aee576b01a" path="/var/lib/kubelet/pods/5c231a23-6a60-4022-9e05-66aee576b01a/volumes" Nov 26 15:14:02 crc kubenswrapper[4651]: I1126 15:14:02.284006 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 26 15:14:02 crc kubenswrapper[4651]: I1126 15:14:02.970326 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssqn4" event={"ID":"53ff2ff6-77e0-46c9-b558-b2a749b4704d","Type":"ContainerStarted","Data":"b2d30046b76df834b75c803ba4d863c6d916dc3dccc03dcc321cb7dc52cebca9"} Nov 26 15:14:03 crc kubenswrapper[4651]: I1126 15:14:03.998075 4651 generic.go:334] "Generic (PLEG): container finished" podID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerID="b2d30046b76df834b75c803ba4d863c6d916dc3dccc03dcc321cb7dc52cebca9" exitCode=0 Nov 26 15:14:03 crc kubenswrapper[4651]: I1126 15:14:03.998432 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssqn4" event={"ID":"53ff2ff6-77e0-46c9-b558-b2a749b4704d","Type":"ContainerDied","Data":"b2d30046b76df834b75c803ba4d863c6d916dc3dccc03dcc321cb7dc52cebca9"} Nov 26 15:14:04 crc kubenswrapper[4651]: I1126 15:14:04.019543 4651 generic.go:334] "Generic (PLEG): container finished" podID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" 
containerID="68ddb1bc4b4fbde710b792abee4525242136bdbfe20979fbdd5571cfd96cfa28" exitCode=0 Nov 26 15:14:04 crc kubenswrapper[4651]: I1126 15:14:04.019601 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rjjbw" event={"ID":"8e09c768-ab48-443b-92fd-e14e83ad5d9e","Type":"ContainerDied","Data":"68ddb1bc4b4fbde710b792abee4525242136bdbfe20979fbdd5571cfd96cfa28"} Nov 26 15:14:04 crc kubenswrapper[4651]: I1126 15:14:04.046944 4651 generic.go:334] "Generic (PLEG): container finished" podID="4185f823-c76c-46a8-86bb-13901074765f" containerID="68edc6cd1f1b1fbccf055e2d70957478b0298ad207f645974aa9c1c49561dfe0" exitCode=0 Nov 26 15:14:04 crc kubenswrapper[4651]: I1126 15:14:04.047014 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8jb5" event={"ID":"4185f823-c76c-46a8-86bb-13901074765f","Type":"ContainerDied","Data":"68edc6cd1f1b1fbccf055e2d70957478b0298ad207f645974aa9c1c49561dfe0"} Nov 26 15:14:08 crc kubenswrapper[4651]: I1126 15:14:08.749956 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 26 15:14:13 crc kubenswrapper[4651]: I1126 15:14:13.139003 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdvq7/must-gather-p8cln" event={"ID":"c87e7492-30cc-4750-b99a-5ef41775cb9b","Type":"ContainerStarted","Data":"6e469a2d22cf80c382e0e65a313ac0d3dfee15d99495cd3d452de19707ef77e8"} Nov 26 15:14:13 crc kubenswrapper[4651]: I1126 15:14:13.139668 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdvq7/must-gather-p8cln" event={"ID":"c87e7492-30cc-4750-b99a-5ef41775cb9b","Type":"ContainerStarted","Data":"429ae40c99b5c5713b5e0fe50030a9d73098925b7685c218e7f330f5a06109a9"} Nov 26 15:14:13 crc kubenswrapper[4651]: I1126 15:14:13.144410 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssqn4" 
event={"ID":"53ff2ff6-77e0-46c9-b558-b2a749b4704d","Type":"ContainerStarted","Data":"3daaa18dbe986a5ee0992865b98ac9830695ee128fc0e3ceff9fe0e89ffb839f"} Nov 26 15:14:13 crc kubenswrapper[4651]: I1126 15:14:13.147562 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rjjbw" event={"ID":"8e09c768-ab48-443b-92fd-e14e83ad5d9e","Type":"ContainerStarted","Data":"7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494"} Nov 26 15:14:13 crc kubenswrapper[4651]: I1126 15:14:13.150214 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8jb5" event={"ID":"4185f823-c76c-46a8-86bb-13901074765f","Type":"ContainerStarted","Data":"8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f"} Nov 26 15:14:13 crc kubenswrapper[4651]: I1126 15:14:13.165755 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kdvq7/must-gather-p8cln" podStartSLOduration=3.962537326 podStartE2EDuration="19.165734293s" podCreationTimestamp="2025-11-26 15:13:54 +0000 UTC" firstStartedPulling="2025-11-26 15:13:56.906239801 +0000 UTC m=+1404.331987395" lastFinishedPulling="2025-11-26 15:14:12.109436768 +0000 UTC m=+1419.535184362" observedRunningTime="2025-11-26 15:14:13.151204912 +0000 UTC m=+1420.576952526" watchObservedRunningTime="2025-11-26 15:14:13.165734293 +0000 UTC m=+1420.591481907" Nov 26 15:14:13 crc kubenswrapper[4651]: I1126 15:14:13.243171 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t8jb5" podStartSLOduration=3.941929448 podStartE2EDuration="18.243152701s" podCreationTimestamp="2025-11-26 15:13:55 +0000 UTC" firstStartedPulling="2025-11-26 15:13:57.760345573 +0000 UTC m=+1405.186093177" lastFinishedPulling="2025-11-26 15:14:12.061568826 +0000 UTC m=+1419.487316430" observedRunningTime="2025-11-26 15:14:13.192946474 +0000 UTC m=+1420.618694088" 
watchObservedRunningTime="2025-11-26 15:14:13.243152701 +0000 UTC m=+1420.668900305" Nov 26 15:14:13 crc kubenswrapper[4651]: I1126 15:14:13.267080 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rjjbw" podStartSLOduration=3.992993818 podStartE2EDuration="18.267060341s" podCreationTimestamp="2025-11-26 15:13:55 +0000 UTC" firstStartedPulling="2025-11-26 15:13:57.809736697 +0000 UTC m=+1405.235484301" lastFinishedPulling="2025-11-26 15:14:12.08380322 +0000 UTC m=+1419.509550824" observedRunningTime="2025-11-26 15:14:13.224527026 +0000 UTC m=+1420.650274660" watchObservedRunningTime="2025-11-26 15:14:13.267060341 +0000 UTC m=+1420.692807945" Nov 26 15:14:13 crc kubenswrapper[4651]: I1126 15:14:13.284476 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ssqn4" podStartSLOduration=4.157420998 podStartE2EDuration="16.284457771s" podCreationTimestamp="2025-11-26 15:13:57 +0000 UTC" firstStartedPulling="2025-11-26 15:13:59.935169102 +0000 UTC m=+1407.360916706" lastFinishedPulling="2025-11-26 15:14:12.062205875 +0000 UTC m=+1419.487953479" observedRunningTime="2025-11-26 15:14:13.254579626 +0000 UTC m=+1420.680327240" watchObservedRunningTime="2025-11-26 15:14:13.284457771 +0000 UTC m=+1420.710205375" Nov 26 15:14:15 crc kubenswrapper[4651]: I1126 15:14:15.468857 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rjjbw" Nov 26 15:14:15 crc kubenswrapper[4651]: I1126 15:14:15.468950 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rjjbw" Nov 26 15:14:15 crc kubenswrapper[4651]: I1126 15:14:15.629938 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t8jb5" Nov 26 15:14:15 crc kubenswrapper[4651]: I1126 15:14:15.630003 4651 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t8jb5" Nov 26 15:14:16 crc kubenswrapper[4651]: I1126 15:14:16.384489 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5b5d786cf6-wsrgh" Nov 26 15:14:16 crc kubenswrapper[4651]: I1126 15:14:16.518367 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-rjjbw" podUID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerName="registry-server" probeResult="failure" output=< Nov 26 15:14:16 crc kubenswrapper[4651]: timeout: failed to connect service ":50051" within 1s Nov 26 15:14:16 crc kubenswrapper[4651]: > Nov 26 15:14:16 crc kubenswrapper[4651]: I1126 15:14:16.688350 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-t8jb5" podUID="4185f823-c76c-46a8-86bb-13901074765f" containerName="registry-server" probeResult="failure" output=< Nov 26 15:14:16 crc kubenswrapper[4651]: timeout: failed to connect service ":50051" within 1s Nov 26 15:14:16 crc kubenswrapper[4651]: > Nov 26 15:14:18 crc kubenswrapper[4651]: I1126 15:14:18.292226 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:14:18 crc kubenswrapper[4651]: I1126 15:14:18.292573 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:14:19 crc kubenswrapper[4651]: I1126 15:14:19.345356 4651 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-ssqn4" podUID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerName="registry-server" probeResult="failure" output=< Nov 26 15:14:19 crc kubenswrapper[4651]: timeout: failed to connect service ":50051" within 1s Nov 26 15:14:19 crc kubenswrapper[4651]: > Nov 26 15:14:19 crc 
kubenswrapper[4651]: I1126 15:14:19.929429 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kdvq7/crc-debug-j4lbs"] Nov 26 15:14:19 crc kubenswrapper[4651]: E1126 15:14:19.930182 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c231a23-6a60-4022-9e05-66aee576b01a" containerName="registry-server" Nov 26 15:14:19 crc kubenswrapper[4651]: I1126 15:14:19.930204 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c231a23-6a60-4022-9e05-66aee576b01a" containerName="registry-server" Nov 26 15:14:19 crc kubenswrapper[4651]: E1126 15:14:19.930243 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c231a23-6a60-4022-9e05-66aee576b01a" containerName="extract-utilities" Nov 26 15:14:19 crc kubenswrapper[4651]: I1126 15:14:19.930252 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c231a23-6a60-4022-9e05-66aee576b01a" containerName="extract-utilities" Nov 26 15:14:19 crc kubenswrapper[4651]: E1126 15:14:19.930276 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c231a23-6a60-4022-9e05-66aee576b01a" containerName="extract-content" Nov 26 15:14:19 crc kubenswrapper[4651]: I1126 15:14:19.930284 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c231a23-6a60-4022-9e05-66aee576b01a" containerName="extract-content" Nov 26 15:14:19 crc kubenswrapper[4651]: I1126 15:14:19.930504 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c231a23-6a60-4022-9e05-66aee576b01a" containerName="registry-server" Nov 26 15:14:19 crc kubenswrapper[4651]: I1126 15:14:19.931299 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" Nov 26 15:14:19 crc kubenswrapper[4651]: I1126 15:14:19.933655 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kdvq7"/"default-dockercfg-dhs4n" Nov 26 15:14:20 crc kubenswrapper[4651]: I1126 15:14:20.065086 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvvp6\" (UniqueName: \"kubernetes.io/projected/a4e037d6-caca-410c-9f5d-c7b864a6f46e-kube-api-access-bvvp6\") pod \"crc-debug-j4lbs\" (UID: \"a4e037d6-caca-410c-9f5d-c7b864a6f46e\") " pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" Nov 26 15:14:20 crc kubenswrapper[4651]: I1126 15:14:20.065196 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4e037d6-caca-410c-9f5d-c7b864a6f46e-host\") pod \"crc-debug-j4lbs\" (UID: \"a4e037d6-caca-410c-9f5d-c7b864a6f46e\") " pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" Nov 26 15:14:20 crc kubenswrapper[4651]: I1126 15:14:20.167048 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvvp6\" (UniqueName: \"kubernetes.io/projected/a4e037d6-caca-410c-9f5d-c7b864a6f46e-kube-api-access-bvvp6\") pod \"crc-debug-j4lbs\" (UID: \"a4e037d6-caca-410c-9f5d-c7b864a6f46e\") " pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" Nov 26 15:14:20 crc kubenswrapper[4651]: I1126 15:14:20.167179 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4e037d6-caca-410c-9f5d-c7b864a6f46e-host\") pod \"crc-debug-j4lbs\" (UID: \"a4e037d6-caca-410c-9f5d-c7b864a6f46e\") " pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" Nov 26 15:14:20 crc kubenswrapper[4651]: I1126 15:14:20.167319 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a4e037d6-caca-410c-9f5d-c7b864a6f46e-host\") pod \"crc-debug-j4lbs\" (UID: \"a4e037d6-caca-410c-9f5d-c7b864a6f46e\") " pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" Nov 26 15:14:20 crc kubenswrapper[4651]: I1126 15:14:20.191959 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvvp6\" (UniqueName: \"kubernetes.io/projected/a4e037d6-caca-410c-9f5d-c7b864a6f46e-kube-api-access-bvvp6\") pod \"crc-debug-j4lbs\" (UID: \"a4e037d6-caca-410c-9f5d-c7b864a6f46e\") " pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" Nov 26 15:14:20 crc kubenswrapper[4651]: I1126 15:14:20.249488 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" Nov 26 15:14:20 crc kubenswrapper[4651]: W1126 15:14:20.287951 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4e037d6_caca_410c_9f5d_c7b864a6f46e.slice/crio-ecc7d9b425a427b20276c7036fb435337937a93883f9548d60b0e0c056f483d8 WatchSource:0}: Error finding container ecc7d9b425a427b20276c7036fb435337937a93883f9548d60b0e0c056f483d8: Status 404 returned error can't find the container with id ecc7d9b425a427b20276c7036fb435337937a93883f9548d60b0e0c056f483d8 Nov 26 15:14:21 crc kubenswrapper[4651]: I1126 15:14:21.234860 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" event={"ID":"a4e037d6-caca-410c-9f5d-c7b864a6f46e","Type":"ContainerStarted","Data":"ecc7d9b425a427b20276c7036fb435337937a93883f9548d60b0e0c056f483d8"} Nov 26 15:14:25 crc kubenswrapper[4651]: I1126 15:14:25.527748 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rjjbw" Nov 26 15:14:25 crc kubenswrapper[4651]: I1126 15:14:25.589554 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-rjjbw" Nov 26 15:14:25 crc kubenswrapper[4651]: I1126 15:14:25.680055 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t8jb5" Nov 26 15:14:25 crc kubenswrapper[4651]: I1126 15:14:25.738855 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t8jb5" Nov 26 15:14:26 crc kubenswrapper[4651]: I1126 15:14:26.935943 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rjjbw"] Nov 26 15:14:27 crc kubenswrapper[4651]: I1126 15:14:27.280429 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rjjbw" podUID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerName="registry-server" containerID="cri-o://7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494" gracePeriod=2 Nov 26 15:14:27 crc kubenswrapper[4651]: I1126 15:14:27.936835 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8jb5"] Nov 26 15:14:27 crc kubenswrapper[4651]: I1126 15:14:27.937119 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t8jb5" podUID="4185f823-c76c-46a8-86bb-13901074765f" containerName="registry-server" containerID="cri-o://8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f" gracePeriod=2 Nov 26 15:14:28 crc kubenswrapper[4651]: I1126 15:14:28.291816 4651 generic.go:334] "Generic (PLEG): container finished" podID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerID="7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494" exitCode=0 Nov 26 15:14:28 crc kubenswrapper[4651]: I1126 15:14:28.291872 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rjjbw" 
event={"ID":"8e09c768-ab48-443b-92fd-e14e83ad5d9e","Type":"ContainerDied","Data":"7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494"} Nov 26 15:14:28 crc kubenswrapper[4651]: I1126 15:14:28.293725 4651 generic.go:334] "Generic (PLEG): container finished" podID="4185f823-c76c-46a8-86bb-13901074765f" containerID="8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f" exitCode=0 Nov 26 15:14:28 crc kubenswrapper[4651]: I1126 15:14:28.293749 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8jb5" event={"ID":"4185f823-c76c-46a8-86bb-13901074765f","Type":"ContainerDied","Data":"8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f"} Nov 26 15:14:28 crc kubenswrapper[4651]: I1126 15:14:28.357876 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:14:28 crc kubenswrapper[4651]: I1126 15:14:28.415448 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:14:29 crc kubenswrapper[4651]: I1126 15:14:29.132545 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:14:29 crc kubenswrapper[4651]: I1126 15:14:29.132908 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:14:30 crc kubenswrapper[4651]: I1126 15:14:30.336548 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ssqn4"] Nov 26 15:14:30 crc kubenswrapper[4651]: I1126 15:14:30.336954 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ssqn4" podUID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerName="registry-server" containerID="cri-o://3daaa18dbe986a5ee0992865b98ac9830695ee128fc0e3ceff9fe0e89ffb839f" gracePeriod=2 Nov 26 15:14:31 crc kubenswrapper[4651]: I1126 15:14:31.324205 4651 generic.go:334] "Generic (PLEG): container finished" podID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerID="3daaa18dbe986a5ee0992865b98ac9830695ee128fc0e3ceff9fe0e89ffb839f" exitCode=0 Nov 26 15:14:31 crc kubenswrapper[4651]: I1126 15:14:31.324433 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssqn4" event={"ID":"53ff2ff6-77e0-46c9-b558-b2a749b4704d","Type":"ContainerDied","Data":"3daaa18dbe986a5ee0992865b98ac9830695ee128fc0e3ceff9fe0e89ffb839f"} Nov 26 15:14:35 crc kubenswrapper[4651]: I1126 15:14:35.382252 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" event={"ID":"a4e037d6-caca-410c-9f5d-c7b864a6f46e","Type":"ContainerStarted","Data":"9374228de5524db856c88b30e7314f6317dfb826f5fa40806aa83d44ca6be164"} Nov 26 15:14:35 crc kubenswrapper[4651]: I1126 15:14:35.420042 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" podStartSLOduration=1.668452552 podStartE2EDuration="16.420020874s" podCreationTimestamp="2025-11-26 15:14:19 +0000 UTC" firstStartedPulling="2025-11-26 15:14:20.294509508 +0000 UTC m=+1427.720257112" lastFinishedPulling="2025-11-26 15:14:35.04607783 +0000 UTC m=+1442.471825434" observedRunningTime="2025-11-26 15:14:35.413766519 +0000 UTC m=+1442.839514123" watchObservedRunningTime="2025-11-26 15:14:35.420020874 +0000 UTC m=+1442.845768478" Nov 26 15:14:35 crc kubenswrapper[4651]: E1126 
15:14:35.475847 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494 is running failed: container process not found" containerID="7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 15:14:35 crc kubenswrapper[4651]: E1126 15:14:35.477546 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494 is running failed: container process not found" containerID="7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 15:14:35 crc kubenswrapper[4651]: E1126 15:14:35.483857 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494 is running failed: container process not found" containerID="7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 15:14:35 crc kubenswrapper[4651]: E1126 15:14:35.483923 4651 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-rjjbw" podUID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerName="registry-server" Nov 26 15:14:35 crc kubenswrapper[4651]: E1126 15:14:35.630747 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f is running failed: container process not found" containerID="8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 15:14:35 crc kubenswrapper[4651]: E1126 15:14:35.631378 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f is running failed: container process not found" containerID="8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 15:14:35 crc kubenswrapper[4651]: E1126 15:14:35.631670 4651 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f is running failed: container process not found" containerID="8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 15:14:35 crc kubenswrapper[4651]: E1126 15:14:35.631717 4651 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-t8jb5" podUID="4185f823-c76c-46a8-86bb-13901074765f" containerName="registry-server" Nov 26 15:14:35 crc kubenswrapper[4651]: I1126 15:14:35.880169 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rjjbw" Nov 26 15:14:35 crc kubenswrapper[4651]: I1126 15:14:35.974928 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-catalog-content\") pod \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " Nov 26 15:14:35 crc kubenswrapper[4651]: I1126 15:14:35.975238 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf4d4\" (UniqueName: \"kubernetes.io/projected/8e09c768-ab48-443b-92fd-e14e83ad5d9e-kube-api-access-lf4d4\") pod \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " Nov 26 15:14:35 crc kubenswrapper[4651]: I1126 15:14:35.975282 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-utilities\") pod \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\" (UID: \"8e09c768-ab48-443b-92fd-e14e83ad5d9e\") " Nov 26 15:14:35 crc kubenswrapper[4651]: I1126 15:14:35.983766 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-utilities" (OuterVolumeSpecName: "utilities") pod "8e09c768-ab48-443b-92fd-e14e83ad5d9e" (UID: "8e09c768-ab48-443b-92fd-e14e83ad5d9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.002469 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e09c768-ab48-443b-92fd-e14e83ad5d9e-kube-api-access-lf4d4" (OuterVolumeSpecName: "kube-api-access-lf4d4") pod "8e09c768-ab48-443b-92fd-e14e83ad5d9e" (UID: "8e09c768-ab48-443b-92fd-e14e83ad5d9e"). InnerVolumeSpecName "kube-api-access-lf4d4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.047400 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e09c768-ab48-443b-92fd-e14e83ad5d9e" (UID: "8e09c768-ab48-443b-92fd-e14e83ad5d9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.072255 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8jb5" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.077739 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.077775 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf4d4\" (UniqueName: \"kubernetes.io/projected/8e09c768-ab48-443b-92fd-e14e83ad5d9e-kube-api-access-lf4d4\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.077787 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e09c768-ab48-443b-92fd-e14e83ad5d9e-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.085078 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.182024 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-catalog-content\") pod \"4185f823-c76c-46a8-86bb-13901074765f\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.182307 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-utilities\") pod \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.182344 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljwb2\" (UniqueName: \"kubernetes.io/projected/53ff2ff6-77e0-46c9-b558-b2a749b4704d-kube-api-access-ljwb2\") pod \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.182387 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-catalog-content\") pod \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\" (UID: \"53ff2ff6-77e0-46c9-b558-b2a749b4704d\") " Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.182476 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-utilities\") pod \"4185f823-c76c-46a8-86bb-13901074765f\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.182494 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-frb6v\" (UniqueName: \"kubernetes.io/projected/4185f823-c76c-46a8-86bb-13901074765f-kube-api-access-frb6v\") pod \"4185f823-c76c-46a8-86bb-13901074765f\" (UID: \"4185f823-c76c-46a8-86bb-13901074765f\") " Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.192647 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53ff2ff6-77e0-46c9-b558-b2a749b4704d-kube-api-access-ljwb2" (OuterVolumeSpecName: "kube-api-access-ljwb2") pod "53ff2ff6-77e0-46c9-b558-b2a749b4704d" (UID: "53ff2ff6-77e0-46c9-b558-b2a749b4704d"). InnerVolumeSpecName "kube-api-access-ljwb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.195822 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-utilities" (OuterVolumeSpecName: "utilities") pod "53ff2ff6-77e0-46c9-b558-b2a749b4704d" (UID: "53ff2ff6-77e0-46c9-b558-b2a749b4704d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.201810 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-utilities" (OuterVolumeSpecName: "utilities") pod "4185f823-c76c-46a8-86bb-13901074765f" (UID: "4185f823-c76c-46a8-86bb-13901074765f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.213463 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53ff2ff6-77e0-46c9-b558-b2a749b4704d" (UID: "53ff2ff6-77e0-46c9-b558-b2a749b4704d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.217846 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4185f823-c76c-46a8-86bb-13901074765f-kube-api-access-frb6v" (OuterVolumeSpecName: "kube-api-access-frb6v") pod "4185f823-c76c-46a8-86bb-13901074765f" (UID: "4185f823-c76c-46a8-86bb-13901074765f"). InnerVolumeSpecName "kube-api-access-frb6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.261581 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4185f823-c76c-46a8-86bb-13901074765f" (UID: "4185f823-c76c-46a8-86bb-13901074765f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.285241 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.285286 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.285322 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljwb2\" (UniqueName: \"kubernetes.io/projected/53ff2ff6-77e0-46c9-b558-b2a749b4704d-kube-api-access-ljwb2\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.285336 4651 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/53ff2ff6-77e0-46c9-b558-b2a749b4704d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.285349 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frb6v\" (UniqueName: \"kubernetes.io/projected/4185f823-c76c-46a8-86bb-13901074765f-kube-api-access-frb6v\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.285360 4651 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4185f823-c76c-46a8-86bb-13901074765f-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.397737 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8jb5" event={"ID":"4185f823-c76c-46a8-86bb-13901074765f","Type":"ContainerDied","Data":"4e2fd861c05dc829291c98b5d8b1bcb60b0529b2a9143946350ab64d27d0fedb"} Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.397795 4651 scope.go:117] "RemoveContainer" containerID="8888b03bf1e20c345d20ddd48196372b84a54263c5dc8d274bdd6619bd9c045f" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.397940 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8jb5" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.413430 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssqn4" event={"ID":"53ff2ff6-77e0-46c9-b558-b2a749b4704d","Type":"ContainerDied","Data":"34bfe1804040e60e0448b08754fc4bb3263d57e51e87fd9fa7883974eb48bef2"} Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.413571 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssqn4" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.424200 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rjjbw" event={"ID":"8e09c768-ab48-443b-92fd-e14e83ad5d9e","Type":"ContainerDied","Data":"249c04bf505cbc0003826f5faccd13688e8e8ebfaec9d73dd89e1963bd110c62"} Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.424255 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rjjbw" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.450256 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8jb5"] Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.484435 4651 scope.go:117] "RemoveContainer" containerID="68edc6cd1f1b1fbccf055e2d70957478b0298ad207f645974aa9c1c49561dfe0" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.484558 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t8jb5"] Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.513860 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssqn4"] Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.543386 4651 scope.go:117] "RemoveContainer" containerID="8e80dfb5765d248afe674a27c2d4176f44b0550b1040d1fe485099d3469b0483" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.547145 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssqn4"] Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.557656 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rjjbw"] Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.569657 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rjjbw"] Nov 26 
15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.597545 4651 scope.go:117] "RemoveContainer" containerID="3daaa18dbe986a5ee0992865b98ac9830695ee128fc0e3ceff9fe0e89ffb839f" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.626571 4651 scope.go:117] "RemoveContainer" containerID="b2d30046b76df834b75c803ba4d863c6d916dc3dccc03dcc321cb7dc52cebca9" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.650565 4651 scope.go:117] "RemoveContainer" containerID="17c764b08c699b8817c803a1b5058962037489bd67fb8903a8a60dba0ca65713" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.671446 4651 scope.go:117] "RemoveContainer" containerID="7ff7c93702a3b0c3ec97b9702c4fbd9f95fea12f196fd65d82fb5f7de92b2494" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.702877 4651 scope.go:117] "RemoveContainer" containerID="68ddb1bc4b4fbde710b792abee4525242136bdbfe20979fbdd5571cfd96cfa28" Nov 26 15:14:36 crc kubenswrapper[4651]: I1126 15:14:36.729726 4651 scope.go:117] "RemoveContainer" containerID="92a11f6dcc33ec746a919a48c24118c4f97b2ce3c2dc26142bbbf73c0e4336ce" Nov 26 15:14:37 crc kubenswrapper[4651]: I1126 15:14:37.415343 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4185f823-c76c-46a8-86bb-13901074765f" path="/var/lib/kubelet/pods/4185f823-c76c-46a8-86bb-13901074765f/volumes" Nov 26 15:14:37 crc kubenswrapper[4651]: I1126 15:14:37.418055 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" path="/var/lib/kubelet/pods/53ff2ff6-77e0-46c9-b558-b2a749b4704d/volumes" Nov 26 15:14:37 crc kubenswrapper[4651]: I1126 15:14:37.418865 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" path="/var/lib/kubelet/pods/8e09c768-ab48-443b-92fd-e14e83ad5d9e/volumes" Nov 26 15:14:59 crc kubenswrapper[4651]: I1126 15:14:59.133127 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:14:59 crc kubenswrapper[4651]: I1126 15:14:59.133720 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:14:59 crc kubenswrapper[4651]: I1126 15:14:59.668680 4651 generic.go:334] "Generic (PLEG): container finished" podID="a4e037d6-caca-410c-9f5d-c7b864a6f46e" containerID="9374228de5524db856c88b30e7314f6317dfb826f5fa40806aa83d44ca6be164" exitCode=0 Nov 26 15:14:59 crc kubenswrapper[4651]: I1126 15:14:59.668795 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" event={"ID":"a4e037d6-caca-410c-9f5d-c7b864a6f46e","Type":"ContainerDied","Data":"9374228de5524db856c88b30e7314f6317dfb826f5fa40806aa83d44ca6be164"} Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.218394 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj"] Nov 26 15:15:00 crc kubenswrapper[4651]: E1126 15:15:00.218789 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerName="extract-content" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.218802 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerName="extract-content" Nov 26 15:15:00 crc kubenswrapper[4651]: E1126 15:15:00.218812 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4185f823-c76c-46a8-86bb-13901074765f" containerName="extract-content" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 
15:15:00.218819 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="4185f823-c76c-46a8-86bb-13901074765f" containerName="extract-content" Nov 26 15:15:00 crc kubenswrapper[4651]: E1126 15:15:00.218832 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4185f823-c76c-46a8-86bb-13901074765f" containerName="extract-utilities" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.218838 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="4185f823-c76c-46a8-86bb-13901074765f" containerName="extract-utilities" Nov 26 15:15:00 crc kubenswrapper[4651]: E1126 15:15:00.218853 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerName="extract-utilities" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.218859 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerName="extract-utilities" Nov 26 15:15:00 crc kubenswrapper[4651]: E1126 15:15:00.218870 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerName="registry-server" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.218875 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerName="registry-server" Nov 26 15:15:00 crc kubenswrapper[4651]: E1126 15:15:00.218886 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4185f823-c76c-46a8-86bb-13901074765f" containerName="registry-server" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.218892 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="4185f823-c76c-46a8-86bb-13901074765f" containerName="registry-server" Nov 26 15:15:00 crc kubenswrapper[4651]: E1126 15:15:00.218906 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerName="extract-utilities" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 
15:15:00.218912 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerName="extract-utilities" Nov 26 15:15:00 crc kubenswrapper[4651]: E1126 15:15:00.218931 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerName="registry-server" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.218936 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerName="registry-server" Nov 26 15:15:00 crc kubenswrapper[4651]: E1126 15:15:00.218952 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerName="extract-content" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.218960 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerName="extract-content" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.219140 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ff2ff6-77e0-46c9-b558-b2a749b4704d" containerName="registry-server" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.219163 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="4185f823-c76c-46a8-86bb-13901074765f" containerName="registry-server" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.219176 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e09c768-ab48-443b-92fd-e14e83ad5d9e" containerName="registry-server" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.219900 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.222399 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.223544 4651 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.238578 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj"] Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.364397 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29cdb444-80a3-488d-ab43-029c8e41210e-config-volume\") pod \"collect-profiles-29402835-hlxqj\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.364502 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qnc\" (UniqueName: \"kubernetes.io/projected/29cdb444-80a3-488d-ab43-029c8e41210e-kube-api-access-m5qnc\") pod \"collect-profiles-29402835-hlxqj\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.364553 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29cdb444-80a3-488d-ab43-029c8e41210e-secret-volume\") pod \"collect-profiles-29402835-hlxqj\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.466258 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29cdb444-80a3-488d-ab43-029c8e41210e-secret-volume\") pod \"collect-profiles-29402835-hlxqj\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.466407 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29cdb444-80a3-488d-ab43-029c8e41210e-config-volume\") pod \"collect-profiles-29402835-hlxqj\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.466506 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qnc\" (UniqueName: \"kubernetes.io/projected/29cdb444-80a3-488d-ab43-029c8e41210e-kube-api-access-m5qnc\") pod \"collect-profiles-29402835-hlxqj\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.467698 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29cdb444-80a3-488d-ab43-029c8e41210e-config-volume\") pod \"collect-profiles-29402835-hlxqj\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.486809 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/29cdb444-80a3-488d-ab43-029c8e41210e-secret-volume\") pod \"collect-profiles-29402835-hlxqj\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.495870 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qnc\" (UniqueName: \"kubernetes.io/projected/29cdb444-80a3-488d-ab43-029c8e41210e-kube-api-access-m5qnc\") pod \"collect-profiles-29402835-hlxqj\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.540646 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.758518 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.804497 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kdvq7/crc-debug-j4lbs"] Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.813957 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kdvq7/crc-debug-j4lbs"] Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.878456 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4e037d6-caca-410c-9f5d-c7b864a6f46e-host\") pod \"a4e037d6-caca-410c-9f5d-c7b864a6f46e\" (UID: \"a4e037d6-caca-410c-9f5d-c7b864a6f46e\") " Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.878716 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvvp6\" (UniqueName: \"kubernetes.io/projected/a4e037d6-caca-410c-9f5d-c7b864a6f46e-kube-api-access-bvvp6\") pod \"a4e037d6-caca-410c-9f5d-c7b864a6f46e\" (UID: \"a4e037d6-caca-410c-9f5d-c7b864a6f46e\") " Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.878757 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4e037d6-caca-410c-9f5d-c7b864a6f46e-host" (OuterVolumeSpecName: "host") pod "a4e037d6-caca-410c-9f5d-c7b864a6f46e" (UID: "a4e037d6-caca-410c-9f5d-c7b864a6f46e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.879214 4651 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4e037d6-caca-410c-9f5d-c7b864a6f46e-host\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.889304 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e037d6-caca-410c-9f5d-c7b864a6f46e-kube-api-access-bvvp6" (OuterVolumeSpecName: "kube-api-access-bvvp6") pod "a4e037d6-caca-410c-9f5d-c7b864a6f46e" (UID: "a4e037d6-caca-410c-9f5d-c7b864a6f46e"). InnerVolumeSpecName "kube-api-access-bvvp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:15:00 crc kubenswrapper[4651]: I1126 15:15:00.980591 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvvp6\" (UniqueName: \"kubernetes.io/projected/a4e037d6-caca-410c-9f5d-c7b864a6f46e-kube-api-access-bvvp6\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:01 crc kubenswrapper[4651]: I1126 15:15:01.221767 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj"] Nov 26 15:15:01 crc kubenswrapper[4651]: I1126 15:15:01.412792 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4e037d6-caca-410c-9f5d-c7b864a6f46e" path="/var/lib/kubelet/pods/a4e037d6-caca-410c-9f5d-c7b864a6f46e/volumes" Nov 26 15:15:01 crc kubenswrapper[4651]: I1126 15:15:01.692152 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" event={"ID":"29cdb444-80a3-488d-ab43-029c8e41210e","Type":"ContainerStarted","Data":"36c5025c7bba7ea0ddd9560a84ffcdb3275f85722a6cebfea45957939558d7b0"} Nov 26 15:15:01 crc kubenswrapper[4651]: I1126 15:15:01.692196 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" event={"ID":"29cdb444-80a3-488d-ab43-029c8e41210e","Type":"ContainerStarted","Data":"b7ff45a1c207dd46f0b98cadae6b557f1e44fb5a1e52d35088d7aba6083fa577"} Nov 26 15:15:01 crc kubenswrapper[4651]: I1126 15:15:01.697785 4651 scope.go:117] "RemoveContainer" containerID="9374228de5524db856c88b30e7314f6317dfb826f5fa40806aa83d44ca6be164" Nov 26 15:15:01 crc kubenswrapper[4651]: I1126 15:15:01.697818 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdvq7/crc-debug-j4lbs" Nov 26 15:15:01 crc kubenswrapper[4651]: I1126 15:15:01.727854 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" podStartSLOduration=1.7278327 podStartE2EDuration="1.7278327s" podCreationTimestamp="2025-11-26 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:15:01.709322972 +0000 UTC m=+1469.135070586" watchObservedRunningTime="2025-11-26 15:15:01.7278327 +0000 UTC m=+1469.153580304" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.076548 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kdvq7/crc-debug-bz7j7"] Nov 26 15:15:02 crc kubenswrapper[4651]: E1126 15:15:02.077362 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4e037d6-caca-410c-9f5d-c7b864a6f46e" containerName="container-00" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.077389 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4e037d6-caca-410c-9f5d-c7b864a6f46e" containerName="container-00" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.077666 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4e037d6-caca-410c-9f5d-c7b864a6f46e" containerName="container-00" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 
15:15:02.078476 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.081961 4651 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kdvq7"/"default-dockercfg-dhs4n" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.211168 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn2bq\" (UniqueName: \"kubernetes.io/projected/16215dc1-5ab5-4e53-9b79-25c52f86147c-kube-api-access-tn2bq\") pod \"crc-debug-bz7j7\" (UID: \"16215dc1-5ab5-4e53-9b79-25c52f86147c\") " pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.211277 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16215dc1-5ab5-4e53-9b79-25c52f86147c-host\") pod \"crc-debug-bz7j7\" (UID: \"16215dc1-5ab5-4e53-9b79-25c52f86147c\") " pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.312963 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn2bq\" (UniqueName: \"kubernetes.io/projected/16215dc1-5ab5-4e53-9b79-25c52f86147c-kube-api-access-tn2bq\") pod \"crc-debug-bz7j7\" (UID: \"16215dc1-5ab5-4e53-9b79-25c52f86147c\") " pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.313082 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16215dc1-5ab5-4e53-9b79-25c52f86147c-host\") pod \"crc-debug-bz7j7\" (UID: \"16215dc1-5ab5-4e53-9b79-25c52f86147c\") " pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.313230 4651 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16215dc1-5ab5-4e53-9b79-25c52f86147c-host\") pod \"crc-debug-bz7j7\" (UID: \"16215dc1-5ab5-4e53-9b79-25c52f86147c\") " pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.336136 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn2bq\" (UniqueName: \"kubernetes.io/projected/16215dc1-5ab5-4e53-9b79-25c52f86147c-kube-api-access-tn2bq\") pod \"crc-debug-bz7j7\" (UID: \"16215dc1-5ab5-4e53-9b79-25c52f86147c\") " pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.395342 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.709328 4651 generic.go:334] "Generic (PLEG): container finished" podID="29cdb444-80a3-488d-ab43-029c8e41210e" containerID="36c5025c7bba7ea0ddd9560a84ffcdb3275f85722a6cebfea45957939558d7b0" exitCode=0 Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.709407 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" event={"ID":"29cdb444-80a3-488d-ab43-029c8e41210e","Type":"ContainerDied","Data":"36c5025c7bba7ea0ddd9560a84ffcdb3275f85722a6cebfea45957939558d7b0"} Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.711693 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" event={"ID":"16215dc1-5ab5-4e53-9b79-25c52f86147c","Type":"ContainerStarted","Data":"730da133cc18ff35f35cb64293c46a24c7fba1b9a93d37f0215ec8ab147a77f6"} Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.711722 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" 
event={"ID":"16215dc1-5ab5-4e53-9b79-25c52f86147c","Type":"ContainerStarted","Data":"077add504d5f320688c9e580afcf809f69af8baa19b6b6b73c70ceeb7edad183"} Nov 26 15:15:02 crc kubenswrapper[4651]: I1126 15:15:02.751189 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" podStartSLOduration=0.751170704 podStartE2EDuration="751.170704ms" podCreationTimestamp="2025-11-26 15:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:15:02.750492633 +0000 UTC m=+1470.176240237" watchObservedRunningTime="2025-11-26 15:15:02.751170704 +0000 UTC m=+1470.176918308" Nov 26 15:15:03 crc kubenswrapper[4651]: I1126 15:15:03.724727 4651 generic.go:334] "Generic (PLEG): container finished" podID="16215dc1-5ab5-4e53-9b79-25c52f86147c" containerID="730da133cc18ff35f35cb64293c46a24c7fba1b9a93d37f0215ec8ab147a77f6" exitCode=1 Nov 26 15:15:03 crc kubenswrapper[4651]: I1126 15:15:03.724779 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" event={"ID":"16215dc1-5ab5-4e53-9b79-25c52f86147c","Type":"ContainerDied","Data":"730da133cc18ff35f35cb64293c46a24c7fba1b9a93d37f0215ec8ab147a77f6"} Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.207396 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.347257 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29cdb444-80a3-488d-ab43-029c8e41210e-config-volume\") pod \"29cdb444-80a3-488d-ab43-029c8e41210e\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.347396 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5qnc\" (UniqueName: \"kubernetes.io/projected/29cdb444-80a3-488d-ab43-029c8e41210e-kube-api-access-m5qnc\") pod \"29cdb444-80a3-488d-ab43-029c8e41210e\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.347425 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29cdb444-80a3-488d-ab43-029c8e41210e-secret-volume\") pod \"29cdb444-80a3-488d-ab43-029c8e41210e\" (UID: \"29cdb444-80a3-488d-ab43-029c8e41210e\") " Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.347934 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29cdb444-80a3-488d-ab43-029c8e41210e-config-volume" (OuterVolumeSpecName: "config-volume") pod "29cdb444-80a3-488d-ab43-029c8e41210e" (UID: "29cdb444-80a3-488d-ab43-029c8e41210e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.361250 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29cdb444-80a3-488d-ab43-029c8e41210e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "29cdb444-80a3-488d-ab43-029c8e41210e" (UID: "29cdb444-80a3-488d-ab43-029c8e41210e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.372678 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29cdb444-80a3-488d-ab43-029c8e41210e-kube-api-access-m5qnc" (OuterVolumeSpecName: "kube-api-access-m5qnc") pod "29cdb444-80a3-488d-ab43-029c8e41210e" (UID: "29cdb444-80a3-488d-ab43-029c8e41210e"). InnerVolumeSpecName "kube-api-access-m5qnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.449841 4651 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29cdb444-80a3-488d-ab43-029c8e41210e-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.449883 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5qnc\" (UniqueName: \"kubernetes.io/projected/29cdb444-80a3-488d-ab43-029c8e41210e-kube-api-access-m5qnc\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.449897 4651 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29cdb444-80a3-488d-ab43-029c8e41210e-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.736013 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.736164 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-hlxqj" event={"ID":"29cdb444-80a3-488d-ab43-029c8e41210e","Type":"ContainerDied","Data":"b7ff45a1c207dd46f0b98cadae6b557f1e44fb5a1e52d35088d7aba6083fa577"} Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.736492 4651 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7ff45a1c207dd46f0b98cadae6b557f1e44fb5a1e52d35088d7aba6083fa577" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.808168 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.851797 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kdvq7/crc-debug-bz7j7"] Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.879669 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kdvq7/crc-debug-bz7j7"] Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.964823 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16215dc1-5ab5-4e53-9b79-25c52f86147c-host\") pod \"16215dc1-5ab5-4e53-9b79-25c52f86147c\" (UID: \"16215dc1-5ab5-4e53-9b79-25c52f86147c\") " Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.964974 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16215dc1-5ab5-4e53-9b79-25c52f86147c-host" (OuterVolumeSpecName: "host") pod "16215dc1-5ab5-4e53-9b79-25c52f86147c" (UID: "16215dc1-5ab5-4e53-9b79-25c52f86147c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.965131 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn2bq\" (UniqueName: \"kubernetes.io/projected/16215dc1-5ab5-4e53-9b79-25c52f86147c-kube-api-access-tn2bq\") pod \"16215dc1-5ab5-4e53-9b79-25c52f86147c\" (UID: \"16215dc1-5ab5-4e53-9b79-25c52f86147c\") " Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.965591 4651 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16215dc1-5ab5-4e53-9b79-25c52f86147c-host\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:04 crc kubenswrapper[4651]: I1126 15:15:04.969435 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16215dc1-5ab5-4e53-9b79-25c52f86147c-kube-api-access-tn2bq" (OuterVolumeSpecName: "kube-api-access-tn2bq") pod "16215dc1-5ab5-4e53-9b79-25c52f86147c" (UID: "16215dc1-5ab5-4e53-9b79-25c52f86147c"). InnerVolumeSpecName "kube-api-access-tn2bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:15:05 crc kubenswrapper[4651]: I1126 15:15:05.067610 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn2bq\" (UniqueName: \"kubernetes.io/projected/16215dc1-5ab5-4e53-9b79-25c52f86147c-kube-api-access-tn2bq\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:05 crc kubenswrapper[4651]: I1126 15:15:05.414297 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16215dc1-5ab5-4e53-9b79-25c52f86147c" path="/var/lib/kubelet/pods/16215dc1-5ab5-4e53-9b79-25c52f86147c/volumes" Nov 26 15:15:05 crc kubenswrapper[4651]: I1126 15:15:05.749384 4651 scope.go:117] "RemoveContainer" containerID="730da133cc18ff35f35cb64293c46a24c7fba1b9a93d37f0215ec8ab147a77f6" Nov 26 15:15:05 crc kubenswrapper[4651]: I1126 15:15:05.749517 4651 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kdvq7/crc-debug-bz7j7" Nov 26 15:15:29 crc kubenswrapper[4651]: I1126 15:15:29.132337 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:15:29 crc kubenswrapper[4651]: I1126 15:15:29.132782 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:15:29 crc kubenswrapper[4651]: I1126 15:15:29.132824 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 15:15:29 crc kubenswrapper[4651]: I1126 15:15:29.133543 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"} pod="openshift-machine-config-operator/machine-config-daemon-99mrs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:15:29 crc kubenswrapper[4651]: I1126 15:15:29.133593 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" containerID="cri-o://9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" gracePeriod=600 Nov 26 15:15:29 crc kubenswrapper[4651]: E1126 15:15:29.251400 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" Nov 26 15:15:29 crc kubenswrapper[4651]: I1126 15:15:29.971894 4651 generic.go:334] "Generic (PLEG): container finished" podID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" exitCode=0 Nov 26 15:15:29 crc kubenswrapper[4651]: I1126 15:15:29.971932 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerDied","Data":"9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"} Nov 26 15:15:29 crc kubenswrapper[4651]: I1126 15:15:29.971962 4651 scope.go:117] "RemoveContainer" containerID="743e37a0879fef7149021c1c72d47f0f5826caa510cee0fbc25f23140cbdb919" Nov 26 15:15:29 crc kubenswrapper[4651]: I1126 15:15:29.972513 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" Nov 26 15:15:29 crc kubenswrapper[4651]: E1126 15:15:29.972832 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" Nov 26 15:15:33 crc kubenswrapper[4651]: I1126 15:15:33.000156 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-5688f744d6-ck9mn_38335944-e310-47d9-b2c1-c6f931134e10/barbican-api/0.log" Nov 26 15:15:33 crc kubenswrapper[4651]: I1126 15:15:33.112627 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5688f744d6-ck9mn_38335944-e310-47d9-b2c1-c6f931134e10/barbican-api-log/0.log" Nov 26 15:15:33 crc kubenswrapper[4651]: I1126 15:15:33.251001 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-c56c-account-create-update-j5bf5_0a7b363a-a7d4-4197-b711-2d3a0b761273/mariadb-account-create-update/0.log" Nov 26 15:15:33 crc kubenswrapper[4651]: I1126 15:15:33.359556 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-create-6fkm4_f46f23b6-3605-4160-a29e-b7f2a84b48f5/mariadb-database-create/0.log" Nov 26 15:15:33 crc kubenswrapper[4651]: I1126 15:15:33.492301 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-sync-p6s6f_81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b/barbican-db-sync/0.log" Nov 26 15:15:33 crc kubenswrapper[4651]: I1126 15:15:33.573841 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7955b99f58-492wx_c7e365e4-f902-439d-92e3-de43fd6ccdaf/barbican-keystone-listener/0.log" Nov 26 15:15:33 crc kubenswrapper[4651]: I1126 15:15:33.693723 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7955b99f58-492wx_c7e365e4-f902-439d-92e3-de43fd6ccdaf/barbican-keystone-listener-log/0.log" Nov 26 15:15:33 crc kubenswrapper[4651]: I1126 15:15:33.815279 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-848fcb696c-vfdpx_bf420acf-d3d6-45d2-a484-66265c5a1bcd/barbican-worker/0.log" Nov 26 15:15:33 crc kubenswrapper[4651]: I1126 15:15:33.844736 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-848fcb696c-vfdpx_bf420acf-d3d6-45d2-a484-66265c5a1bcd/barbican-worker-log/0.log" Nov 26 15:15:34 crc kubenswrapper[4651]: I1126 15:15:34.062233 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b7f683f0-e63a-41d1-9c75-adc0175d9c9c/ceilometer-notification-agent/0.log" Nov 26 15:15:34 crc kubenswrapper[4651]: I1126 15:15:34.071509 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b7f683f0-e63a-41d1-9c75-adc0175d9c9c/ceilometer-central-agent/0.log" Nov 26 15:15:34 crc kubenswrapper[4651]: I1126 15:15:34.197707 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b7f683f0-e63a-41d1-9c75-adc0175d9c9c/proxy-httpd/0.log" Nov 26 15:15:34 crc kubenswrapper[4651]: I1126 15:15:34.231702 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b7f683f0-e63a-41d1-9c75-adc0175d9c9c/sg-core/0.log" Nov 26 15:15:34 crc kubenswrapper[4651]: I1126 15:15:34.278541 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-3572-account-create-update-b7cth_8d514364-b561-4d18-9b82-bfd428216060/mariadb-account-create-update/0.log" Nov 26 15:15:34 crc kubenswrapper[4651]: I1126 15:15:34.532576 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f93566af-1613-4989-a326-26aa8cc4447c/cinder-api/0.log" Nov 26 15:15:34 crc kubenswrapper[4651]: I1126 15:15:34.596754 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f93566af-1613-4989-a326-26aa8cc4447c/cinder-api-log/0.log" Nov 26 15:15:34 crc kubenswrapper[4651]: I1126 15:15:34.705297 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-create-28wwh_a78599f6-a349-4abd-b862-37ea4d85818d/mariadb-database-create/0.log" Nov 26 15:15:34 crc kubenswrapper[4651]: I1126 15:15:34.834200 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-db-sync-wzxcr_0b39efce-2985-4f46-91a2-bb397f605c9c/cinder-db-sync/0.log" Nov 26 15:15:34 crc kubenswrapper[4651]: I1126 15:15:34.961990 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e27ec632-c98c-4da3-a998-299e18d1bc99/cinder-scheduler/0.log" Nov 26 15:15:35 crc kubenswrapper[4651]: I1126 15:15:35.028062 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e27ec632-c98c-4da3-a998-299e18d1bc99/probe/0.log" Nov 26 15:15:35 crc kubenswrapper[4651]: I1126 15:15:35.200989 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85649f948c-kg5nr_3bc609df-69f6-442d-80fb-f070eb8b674c/init/0.log" Nov 26 15:15:35 crc kubenswrapper[4651]: I1126 15:15:35.319480 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85649f948c-kg5nr_3bc609df-69f6-442d-80fb-f070eb8b674c/init/0.log" Nov 26 15:15:35 crc kubenswrapper[4651]: I1126 15:15:35.391055 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85649f948c-kg5nr_3bc609df-69f6-442d-80fb-f070eb8b674c/dnsmasq-dns/0.log" Nov 26 15:15:35 crc kubenswrapper[4651]: I1126 15:15:35.456759 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-d20d-account-create-update-56vgl_eb658c85-2b54-43fb-9938-0c5558ae3da8/mariadb-account-create-update/0.log" Nov 26 15:15:35 crc kubenswrapper[4651]: I1126 15:15:35.644798 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-create-ss5m4_9e5f579e-17c0-425c-bd60-0f4950eabdc8/mariadb-database-create/0.log" Nov 26 15:15:35 crc kubenswrapper[4651]: I1126 15:15:35.822250 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-sync-6ld6v_0dfcc6ac-236d-4333-9126-5ee10d1e0417/glance-db-sync/0.log" Nov 26 15:15:36 crc kubenswrapper[4651]: I1126 15:15:36.056697 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_49ff8933-59b2-4620-a03d-a14767db747d/glance-httpd/0.log" Nov 26 15:15:36 crc kubenswrapper[4651]: I1126 15:15:36.068105 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_49ff8933-59b2-4620-a03d-a14767db747d/glance-log/0.log" Nov 26 15:15:36 crc kubenswrapper[4651]: I1126 15:15:36.256614 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2760c509-e00a-4aab-9471-0cf7f5177471/glance-httpd/0.log" Nov 26 15:15:36 crc kubenswrapper[4651]: I1126 15:15:36.304721 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2760c509-e00a-4aab-9471-0cf7f5177471/glance-log/0.log" Nov 26 15:15:36 crc kubenswrapper[4651]: I1126 15:15:36.510159 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f54c7c77d-rx8gm_5c09de21-84b0-440d-b34c-3054ec6741fc/horizon/2.log" Nov 26 15:15:36 crc kubenswrapper[4651]: I1126 15:15:36.583077 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f54c7c77d-rx8gm_5c09de21-84b0-440d-b34c-3054ec6741fc/horizon/1.log" Nov 26 15:15:36 crc kubenswrapper[4651]: I1126 15:15:36.620522 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f54c7c77d-rx8gm_5c09de21-84b0-440d-b34c-3054ec6741fc/horizon-log/0.log" Nov 26 15:15:36 crc kubenswrapper[4651]: I1126 15:15:36.848978 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7d45c4597-qv4b7_1c55f267-1433-4e39-9f2e-a2a8e038dbb5/keystone-api/0.log" Nov 26 15:15:36 crc kubenswrapper[4651]: I1126 15:15:36.895150 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bootstrap-24pqd_2eca4c8f-cc45-46f6-8730-187af536d3b1/keystone-bootstrap/0.log" Nov 26 15:15:37 crc kubenswrapper[4651]: I1126 15:15:37.061298 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-c181-account-create-update-prbwt_a2667dc1-777a-469d-8021-ff0881c8d0d2/mariadb-account-create-update/0.log" Nov 26 15:15:37 crc kubenswrapper[4651]: I1126 15:15:37.155764 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-create-v6vlh_704d880b-4c5f-4663-9bb1-8e40fa9b6752/mariadb-database-create/0.log" Nov 26 15:15:37 crc kubenswrapper[4651]: I1126 15:15:37.323665 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-72sf9_ae123901-25f9-4788-b666-bcb72066c3c4/keystone-db-sync/0.log" Nov 26 15:15:37 crc kubenswrapper[4651]: I1126 15:15:37.514611 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8c2d03fc-6edd-4654-8116-99aae88e3fab/kube-state-metrics/2.log" Nov 26 15:15:37 crc kubenswrapper[4651]: I1126 15:15:37.547334 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8c2d03fc-6edd-4654-8116-99aae88e3fab/kube-state-metrics/3.log" Nov 26 15:15:37 crc kubenswrapper[4651]: I1126 15:15:37.843878 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5659bd5cb9-6wbmd_792fb64e-7839-4871-9f5e-3799da118c4d/neutron-httpd/0.log" Nov 26 15:15:37 crc kubenswrapper[4651]: I1126 15:15:37.855782 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5659bd5cb9-6wbmd_792fb64e-7839-4871-9f5e-3799da118c4d/neutron-api/0.log" Nov 26 15:15:38 crc kubenswrapper[4651]: I1126 15:15:38.054872 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-bb66-account-create-update-s2ll5_3bf0489a-9b4f-4cd4-95a8-42a5fd115b89/mariadb-account-create-update/0.log" Nov 26 15:15:38 crc kubenswrapper[4651]: I1126 15:15:38.210579 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-create-wvffz_88ffdfc1-d77f-4094-a0ba-2800d4c4d878/mariadb-database-create/0.log" Nov 26 15:15:38 crc kubenswrapper[4651]: 
I1126 15:15:38.357867 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-sync-9glnv_147296af-97b7-4982-ab39-d7f3b78f042d/neutron-db-sync/0.log" Nov 26 15:15:38 crc kubenswrapper[4651]: I1126 15:15:38.665907 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c4174859-235d-4a45-ba34-178b888f3513/nova-api-api/0.log" Nov 26 15:15:38 crc kubenswrapper[4651]: I1126 15:15:38.730532 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c4174859-235d-4a45-ba34-178b888f3513/nova-api-log/0.log" Nov 26 15:15:38 crc kubenswrapper[4651]: I1126 15:15:38.873857 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-db-create-jjrbk_b5279318-bb6d-455f-96a5-d410d0468c6b/mariadb-database-create/0.log" Nov 26 15:15:39 crc kubenswrapper[4651]: I1126 15:15:39.003415 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-e077-account-create-update-9ntsd_cce0bade-306a-4aa8-bbff-a24a79d73e22/mariadb-account-create-update/0.log" Nov 26 15:15:39 crc kubenswrapper[4651]: I1126 15:15:39.226290 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-77b2-account-create-update-v7shh_e9edcf88-8f1d-419e-afd8-5e5861d5b5ad/mariadb-account-create-update/0.log" Nov 26 15:15:39 crc kubenswrapper[4651]: I1126 15:15:39.407922 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-cell-mapping-tpjxc_ad85fcab-3573-4019-89bc-f35413ff0a9d/nova-manage/0.log" Nov 26 15:15:39 crc kubenswrapper[4651]: I1126 15:15:39.633285 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a4e5f466-486a-4c2c-8cd7-528169932031/nova-cell0-conductor-conductor/0.log" Nov 26 15:15:39 crc kubenswrapper[4651]: I1126 15:15:39.720690 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-db-sync-shlzt_20922a1a-1763-45a9-911a-161e1fc4bd1e/nova-cell0-conductor-db-sync/0.log" Nov 26 15:15:40 crc kubenswrapper[4651]: I1126 15:15:40.055340 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-create-n2799_5053f0e2-7865-4be5-9601-2c69da731509/mariadb-database-create/0.log" Nov 26 15:15:40 crc kubenswrapper[4651]: I1126 15:15:40.120376 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-aa1b-account-create-update-zsh2r_dc0af2b0-0d8a-488e-97d3-5956869cd9e9/mariadb-account-create-update/0.log" Nov 26 15:15:40 crc kubenswrapper[4651]: I1126 15:15:40.355201 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-cell-mapping-48lfs_41567ca0-5457-4763-a8f9-b28588b4b7b1/nova-manage/0.log" Nov 26 15:15:40 crc kubenswrapper[4651]: I1126 15:15:40.550875 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d317ad85-fe1e-4e2d-b7ac-745c2efb8a44/nova-cell1-conductor-conductor/0.log" Nov 26 15:15:40 crc kubenswrapper[4651]: I1126 15:15:40.644973 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-db-sync-qmd5q_86a2f131-c449-4541-9822-75711dee8ad3/nova-cell1-conductor-db-sync/0.log" Nov 26 15:15:40 crc kubenswrapper[4651]: I1126 15:15:40.849340 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-create-7k6w2_34e0710f-88c5-4a6a-96d5-97f4a934eeed/mariadb-database-create/0.log" Nov 26 15:15:41 crc kubenswrapper[4651]: I1126 15:15:41.034353 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d5bc7701-632a-44c9-b812-0314af81833a/nova-cell1-novncproxy-novncproxy/0.log" Nov 26 15:15:41 crc kubenswrapper[4651]: I1126 15:15:41.407768 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_71cdc146-583a-403a-9758-de42eea152de/nova-metadata-log/0.log" Nov 26 15:15:41 crc kubenswrapper[4651]: I1126 15:15:41.429721 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_71cdc146-583a-403a-9758-de42eea152de/nova-metadata-metadata/0.log" Nov 26 15:15:41 crc kubenswrapper[4651]: I1126 15:15:41.574419 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_7fd0076a-46e8-4776-845d-f69a0679c989/nova-scheduler-scheduler/0.log" Nov 26 15:15:41 crc kubenswrapper[4651]: I1126 15:15:41.642325 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fded8231-49dd-41d6-8e30-85572ad226db/mysql-bootstrap/0.log" Nov 26 15:15:42 crc kubenswrapper[4651]: I1126 15:15:42.044749 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fded8231-49dd-41d6-8e30-85572ad226db/galera/0.log" Nov 26 15:15:42 crc kubenswrapper[4651]: I1126 15:15:42.065937 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fded8231-49dd-41d6-8e30-85572ad226db/mysql-bootstrap/0.log" Nov 26 15:15:42 crc kubenswrapper[4651]: I1126 15:15:42.081782 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fcd2f469-d922-4c5d-a885-517ad214a748/mysql-bootstrap/0.log" Nov 26 15:15:42 crc kubenswrapper[4651]: I1126 15:15:42.319809 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fcd2f469-d922-4c5d-a885-517ad214a748/mysql-bootstrap/0.log" Nov 26 15:15:42 crc kubenswrapper[4651]: I1126 15:15:42.352843 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fcd2f469-d922-4c5d-a885-517ad214a748/galera/0.log" Nov 26 15:15:42 crc kubenswrapper[4651]: I1126 15:15:42.405440 4651 scope.go:117] "RemoveContainer" 
containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" Nov 26 15:15:42 crc kubenswrapper[4651]: E1126 15:15:42.405653 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" Nov 26 15:15:42 crc kubenswrapper[4651]: I1126 15:15:42.480264 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a13f0157-d7dd-46c4-86cf-7397655d1e83/openstackclient/0.log" Nov 26 15:15:42 crc kubenswrapper[4651]: I1126 15:15:42.578642 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bp6gp_ee616f8e-429e-4224-90d8-7757d8f52ebd/openstack-network-exporter/0.log" Nov 26 15:15:42 crc kubenswrapper[4651]: I1126 15:15:42.844927 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4hsfq_e1e2255f-1898-48ec-b534-24907368820d/ovsdb-server-init/0.log" Nov 26 15:15:43 crc kubenswrapper[4651]: I1126 15:15:43.218869 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4hsfq_e1e2255f-1898-48ec-b534-24907368820d/ovsdb-server/0.log" Nov 26 15:15:43 crc kubenswrapper[4651]: I1126 15:15:43.240144 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4hsfq_e1e2255f-1898-48ec-b534-24907368820d/ovs-vswitchd/0.log" Nov 26 15:15:43 crc kubenswrapper[4651]: I1126 15:15:43.301315 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4hsfq_e1e2255f-1898-48ec-b534-24907368820d/ovsdb-server-init/0.log" Nov 26 15:15:43 crc kubenswrapper[4651]: I1126 15:15:43.510449 4651 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-zrhdf_13f26ce1-fcd6-47bf-b95d-d93e41dd795f/ovn-controller/0.log" Nov 26 15:15:43 crc kubenswrapper[4651]: I1126 15:15:43.613135 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3b709538-7306-4705-aafe-d0d20e151610/ovn-northd/0.log" Nov 26 15:15:43 crc kubenswrapper[4651]: I1126 15:15:43.671609 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3b709538-7306-4705-aafe-d0d20e151610/openstack-network-exporter/0.log" Nov 26 15:15:43 crc kubenswrapper[4651]: I1126 15:15:43.979280 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ce029284-0f6b-4827-9831-6c9b6b5cec58/openstack-network-exporter/0.log" Nov 26 15:15:44 crc kubenswrapper[4651]: I1126 15:15:44.009420 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ce029284-0f6b-4827-9831-6c9b6b5cec58/ovsdbserver-nb/0.log" Nov 26 15:15:44 crc kubenswrapper[4651]: I1126 15:15:44.324197 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1d41f859-b430-492d-856b-e623f18f5df9/openstack-network-exporter/0.log" Nov 26 15:15:44 crc kubenswrapper[4651]: I1126 15:15:44.344236 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1d41f859-b430-492d-856b-e623f18f5df9/ovsdbserver-sb/0.log" Nov 26 15:15:44 crc kubenswrapper[4651]: I1126 15:15:44.354339 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6749-account-create-update-5qhl6_e5a57ad0-61ac-42e0-b0e2-602914415dee/mariadb-account-create-update/0.log" Nov 26 15:15:44 crc kubenswrapper[4651]: I1126 15:15:44.728784 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d7dcdb968-2bhkx_50553d30-1881-42f4-9e57-224db8e5be3c/placement-api/0.log" Nov 26 15:15:44 crc kubenswrapper[4651]: I1126 15:15:44.760993 4651 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_placement-6d7dcdb968-2bhkx_50553d30-1881-42f4-9e57-224db8e5be3c/placement-log/0.log" Nov 26 15:15:45 crc kubenswrapper[4651]: I1126 15:15:45.033218 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-create-wdqdf_ccbf26af-444d-421b-8b60-4c6c343564cb/mariadb-database-create/0.log" Nov 26 15:15:45 crc kubenswrapper[4651]: I1126 15:15:45.074900 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-sync-n9whp_c1259668-c013-4143-b8b4-677a639a764e/placement-db-sync/0.log" Nov 26 15:15:45 crc kubenswrapper[4651]: I1126 15:15:45.441113 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4fc026e6-8f32-45d0-bab4-c12dd93d946f/setup-container/0.log" Nov 26 15:15:45 crc kubenswrapper[4651]: I1126 15:15:45.824969 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4fc026e6-8f32-45d0-bab4-c12dd93d946f/rabbitmq/0.log" Nov 26 15:15:45 crc kubenswrapper[4651]: I1126 15:15:45.827350 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4fc026e6-8f32-45d0-bab4-c12dd93d946f/setup-container/0.log" Nov 26 15:15:45 crc kubenswrapper[4651]: I1126 15:15:45.937851 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8f351a70-5e04-4270-b9bb-00586a94da1f/setup-container/0.log" Nov 26 15:15:46 crc kubenswrapper[4651]: I1126 15:15:46.138402 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8f351a70-5e04-4270-b9bb-00586a94da1f/setup-container/0.log" Nov 26 15:15:46 crc kubenswrapper[4651]: I1126 15:15:46.211472 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8f351a70-5e04-4270-b9bb-00586a94da1f/rabbitmq/0.log" Nov 26 15:15:46 crc kubenswrapper[4651]: I1126 15:15:46.368246 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-6978d54687-jsqtl_09fca043-ad27-4285-8894-522bc6cc68f4/proxy-httpd/0.log" Nov 26 15:15:46 crc kubenswrapper[4651]: I1126 15:15:46.550704 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6978d54687-jsqtl_09fca043-ad27-4285-8894-522bc6cc68f4/proxy-server/0.log" Nov 26 15:15:46 crc kubenswrapper[4651]: I1126 15:15:46.591156 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-b5b9c_7035c6b3-8bd2-4447-9a56-bee3af6dceae/swift-ring-rebalance/0.log" Nov 26 15:15:46 crc kubenswrapper[4651]: I1126 15:15:46.805358 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/account-reaper/0.log" Nov 26 15:15:46 crc kubenswrapper[4651]: I1126 15:15:46.864779 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/account-auditor/0.log" Nov 26 15:15:46 crc kubenswrapper[4651]: I1126 15:15:46.867143 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/account-replicator/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.042742 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/account-server/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.045459 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/container-auditor/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.206223 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/container-replicator/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.254444 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/container-server/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.297328 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/object-auditor/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.355347 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/container-updater/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.458096 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/object-expirer/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.573922 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/object-replicator/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.601962 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/object-server/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.680216 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/object-updater/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.773262 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/rsync/0.log" Nov 26 15:15:47 crc kubenswrapper[4651]: I1126 15:15:47.882731 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a3b8c2db-ce7f-48ce-9fd1-d55b5583773e/swift-recon-cron/0.log" Nov 26 15:15:48 crc kubenswrapper[4651]: I1126 15:15:48.995245 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_31f56fef-ac96-4560-b99c-71b77bcecd4b/memcached/0.log" Nov 26 15:15:50 crc kubenswrapper[4651]: I1126 15:15:50.207200 4651 scope.go:117] "RemoveContainer" containerID="45ba5535542b59701406caadd2410eea4b79aae4fde3b5ba66e91d74fb60bc2b" Nov 26 15:15:50 crc kubenswrapper[4651]: I1126 15:15:50.231515 4651 scope.go:117] "RemoveContainer" containerID="8c9fe5be740e9003884ecb7d4016fca9c33b14d93ae801df5652b5720280676e" Nov 26 15:15:51 crc kubenswrapper[4651]: I1126 15:15:51.059470 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-wdqdf"] Nov 26 15:15:51 crc kubenswrapper[4651]: I1126 15:15:51.071442 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c181-account-create-update-prbwt"] Nov 26 15:15:51 crc kubenswrapper[4651]: I1126 15:15:51.081783 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-wdqdf"] Nov 26 15:15:51 crc kubenswrapper[4651]: I1126 15:15:51.093136 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c181-account-create-update-prbwt"] Nov 26 15:15:51 crc kubenswrapper[4651]: I1126 15:15:51.105820 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6749-account-create-update-5qhl6"] Nov 26 15:15:51 crc kubenswrapper[4651]: I1126 15:15:51.120091 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6749-account-create-update-5qhl6"] Nov 26 15:15:51 crc kubenswrapper[4651]: I1126 15:15:51.415549 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2667dc1-777a-469d-8021-ff0881c8d0d2" path="/var/lib/kubelet/pods/a2667dc1-777a-469d-8021-ff0881c8d0d2/volumes" Nov 26 15:15:51 crc kubenswrapper[4651]: I1126 15:15:51.417070 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccbf26af-444d-421b-8b60-4c6c343564cb" path="/var/lib/kubelet/pods/ccbf26af-444d-421b-8b60-4c6c343564cb/volumes" Nov 26 
15:15:51 crc kubenswrapper[4651]: I1126 15:15:51.417877 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a57ad0-61ac-42e0-b0e2-602914415dee" path="/var/lib/kubelet/pods/e5a57ad0-61ac-42e0-b0e2-602914415dee/volumes" Nov 26 15:15:52 crc kubenswrapper[4651]: I1126 15:15:52.028502 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ss5m4"] Nov 26 15:15:52 crc kubenswrapper[4651]: I1126 15:15:52.043803 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-v6vlh"] Nov 26 15:15:52 crc kubenswrapper[4651]: I1126 15:15:52.054989 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d20d-account-create-update-56vgl"] Nov 26 15:15:52 crc kubenswrapper[4651]: I1126 15:15:52.068009 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ss5m4"] Nov 26 15:15:52 crc kubenswrapper[4651]: I1126 15:15:52.076974 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d20d-account-create-update-56vgl"] Nov 26 15:15:52 crc kubenswrapper[4651]: I1126 15:15:52.088094 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-v6vlh"] Nov 26 15:15:53 crc kubenswrapper[4651]: I1126 15:15:53.417245 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="704d880b-4c5f-4663-9bb1-8e40fa9b6752" path="/var/lib/kubelet/pods/704d880b-4c5f-4663-9bb1-8e40fa9b6752/volumes" Nov 26 15:15:53 crc kubenswrapper[4651]: I1126 15:15:53.418101 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e5f579e-17c0-425c-bd60-0f4950eabdc8" path="/var/lib/kubelet/pods/9e5f579e-17c0-425c-bd60-0f4950eabdc8/volumes" Nov 26 15:15:53 crc kubenswrapper[4651]: I1126 15:15:53.418710 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb658c85-2b54-43fb-9938-0c5558ae3da8" path="/var/lib/kubelet/pods/eb658c85-2b54-43fb-9938-0c5558ae3da8/volumes" Nov 26 15:15:55 crc 
kubenswrapper[4651]: I1126 15:15:55.402830 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" Nov 26 15:15:55 crc kubenswrapper[4651]: E1126 15:15:55.403416 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" Nov 26 15:16:08 crc kubenswrapper[4651]: I1126 15:16:08.402420 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" Nov 26 15:16:08 crc kubenswrapper[4651]: E1126 15:16:08.403371 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" Nov 26 15:16:12 crc kubenswrapper[4651]: I1126 15:16:12.358151 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f_ca662d04-4f23-48dc-b58c-d96bb9d5073c/util/0.log" Nov 26 15:16:12 crc kubenswrapper[4651]: I1126 15:16:12.558911 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f_ca662d04-4f23-48dc-b58c-d96bb9d5073c/pull/0.log" Nov 26 15:16:12 crc kubenswrapper[4651]: I1126 15:16:12.568383 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f_ca662d04-4f23-48dc-b58c-d96bb9d5073c/pull/0.log" Nov 26 15:16:12 crc kubenswrapper[4651]: I1126 15:16:12.601295 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f_ca662d04-4f23-48dc-b58c-d96bb9d5073c/util/0.log" Nov 26 15:16:12 crc kubenswrapper[4651]: I1126 15:16:12.811474 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f_ca662d04-4f23-48dc-b58c-d96bb9d5073c/pull/0.log" Nov 26 15:16:12 crc kubenswrapper[4651]: I1126 15:16:12.817591 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f_ca662d04-4f23-48dc-b58c-d96bb9d5073c/extract/0.log" Nov 26 15:16:12 crc kubenswrapper[4651]: I1126 15:16:12.836454 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7a28072cb0ef6b990d9b21b5d5d72fe21e00c0dcce64865eec1836f82a99x9f_ca662d04-4f23-48dc-b58c-d96bb9d5073c/util/0.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.044948 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-5jb5x_ec10af15-dcf5-413d-87ef-0ca5a469b5fa/manager/1.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.084830 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-5jb5x_ec10af15-dcf5-413d-87ef-0ca5a469b5fa/manager/2.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.112245 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-5jb5x_ec10af15-dcf5-413d-87ef-0ca5a469b5fa/kube-rbac-proxy/0.log" Nov 26 15:16:13 crc 
kubenswrapper[4651]: I1126 15:16:13.283588 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-k4tq9_85fb4e98-47db-403d-85e3-c2550cd47160/kube-rbac-proxy/0.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.316958 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-k4tq9_85fb4e98-47db-403d-85e3-c2550cd47160/manager/2.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.344409 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-k4tq9_85fb4e98-47db-403d-85e3-c2550cd47160/manager/1.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.473017 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-q8cjf_5f58ef49-d516-48e5-a508-e4102374d111/kube-rbac-proxy/0.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.529589 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-q8cjf_5f58ef49-d516-48e5-a508-e4102374d111/manager/2.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.550363 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-q8cjf_5f58ef49-d516-48e5-a508-e4102374d111/manager/1.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.675123 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-gqj7p_6a660fe2-a185-4e56-98cb-b12cdd749964/kube-rbac-proxy/0.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.753030 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-gqj7p_6a660fe2-a185-4e56-98cb-b12cdd749964/manager/1.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.806983 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-gqj7p_6a660fe2-a185-4e56-98cb-b12cdd749964/manager/2.log" Nov 26 15:16:13 crc kubenswrapper[4651]: I1126 15:16:13.920794 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-pt9q8_e5c0812c-3183-4f45-b6b9-d4975f8bb80a/kube-rbac-proxy/0.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.000958 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-pt9q8_e5c0812c-3183-4f45-b6b9-d4975f8bb80a/manager/2.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.116849 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-pt9q8_e5c0812c-3183-4f45-b6b9-d4975f8bb80a/manager/1.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.228189 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-v89cv_eed373f0-add9-4ae8-b5cc-ed711e79b5c5/kube-rbac-proxy/0.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.246387 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-v89cv_eed373f0-add9-4ae8-b5cc-ed711e79b5c5/manager/2.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.395009 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-v89cv_eed373f0-add9-4ae8-b5cc-ed711e79b5c5/manager/1.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.498868 4651 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-shslt_99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6/manager/2.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.527598 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-shslt_99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6/kube-rbac-proxy/0.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.648847 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-shslt_99b3b839-cb5f-4e5e-82b4-cbbb7b18ddb6/manager/1.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.704648 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-cggjs_14110a58-3dd5-4827-8a86-d4c0fc377b97/kube-rbac-proxy/0.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.747080 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-cggjs_14110a58-3dd5-4827-8a86-d4c0fc377b97/manager/3.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.846895 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-cggjs_14110a58-3dd5-4827-8a86-d4c0fc377b97/manager/2.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.915184 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-hmndm_dc5a51cf-b992-4542-8b00-2948ab513eed/manager/2.log" Nov 26 15:16:14 crc kubenswrapper[4651]: I1126 15:16:14.977152 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-hmndm_dc5a51cf-b992-4542-8b00-2948ab513eed/kube-rbac-proxy/0.log" Nov 26 15:16:15 crc kubenswrapper[4651]: I1126 
15:16:15.075520 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-hmndm_dc5a51cf-b992-4542-8b00-2948ab513eed/manager/1.log" Nov 26 15:16:15 crc kubenswrapper[4651]: I1126 15:16:15.182363 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-tszf4_53400076-0e4e-4e0b-b476-d4a1fd901631/kube-rbac-proxy/0.log" Nov 26 15:16:15 crc kubenswrapper[4651]: I1126 15:16:15.224408 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-tszf4_53400076-0e4e-4e0b-b476-d4a1fd901631/manager/2.log" Nov 26 15:16:15 crc kubenswrapper[4651]: I1126 15:16:15.494820 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-tszf4_53400076-0e4e-4e0b-b476-d4a1fd901631/manager/1.log" Nov 26 15:16:15 crc kubenswrapper[4651]: I1126 15:16:15.512201 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-ffbs5_8a55643f-68a5-47ea-8b27-db437d3af215/kube-rbac-proxy/0.log" Nov 26 15:16:15 crc kubenswrapper[4651]: I1126 15:16:15.574585 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-ffbs5_8a55643f-68a5-47ea-8b27-db437d3af215/manager/2.log" Nov 26 15:16:15 crc kubenswrapper[4651]: I1126 15:16:15.664765 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-ffbs5_8a55643f-68a5-47ea-8b27-db437d3af215/manager/1.log" Nov 26 15:16:15 crc kubenswrapper[4651]: I1126 15:16:15.772351 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-8h624_8271ec0d-f8ea-4c46-984f-95572691a379/kube-rbac-proxy/0.log" Nov 26 
15:16:15 crc kubenswrapper[4651]: I1126 15:16:15.864832 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-8h624_8271ec0d-f8ea-4c46-984f-95572691a379/manager/2.log" Nov 26 15:16:15 crc kubenswrapper[4651]: I1126 15:16:15.915265 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-8h624_8271ec0d-f8ea-4c46-984f-95572691a379/manager/1.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.005956 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-cnwcz_e9981be4-751d-4c74-894a-698adad4c50f/kube-rbac-proxy/0.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.084539 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-cnwcz_e9981be4-751d-4c74-894a-698adad4c50f/manager/2.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.191378 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-cnwcz_e9981be4-751d-4c74-894a-698adad4c50f/manager/1.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.212749 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-x9mdd_b24122be-246e-4dc9-a3ad-4ca2392a4660/kube-rbac-proxy/0.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.312815 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-x9mdd_b24122be-246e-4dc9-a3ad-4ca2392a4660/manager/3.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.411501 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-x9mdd_b24122be-246e-4dc9-a3ad-4ca2392a4660/manager/2.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.518645 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9_6b7bc81d-5bbe-4c1b-a512-93e75a1f7035/kube-rbac-proxy/0.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.554799 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9_6b7bc81d-5bbe-4c1b-a512-93e75a1f7035/manager/1.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.634597 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6bjd2d9_6b7bc81d-5bbe-4c1b-a512-93e75a1f7035/manager/0.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.772010 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5bcdd9fbc-vsb4g_e50a607f-7a61-4a78-870a-297fa0daa977/manager/1.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.777388 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5bcdd9fbc-vsb4g_e50a607f-7a61-4a78-870a-297fa0daa977/manager/2.log" Nov 26 15:16:16 crc kubenswrapper[4651]: I1126 15:16:16.909935 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6b4f979c6c-lg95c_674eb001-765e-433a-89d6-2a82fb599a93/operator/1.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.031481 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6b4f979c6c-lg95c_674eb001-765e-433a-89d6-2a82fb599a93/operator/0.log" Nov 26 15:16:17 crc 
kubenswrapper[4651]: I1126 15:16:17.090164 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-lmwxn_58c9e481-e131-494a-808a-c27aaae0ebaa/registry-server/0.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.267804 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-k2rdd_ce4c06a7-4bcb-4167-bec1-14a45ca24bea/manager/2.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.294455 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-k2rdd_ce4c06a7-4bcb-4167-bec1-14a45ca24bea/manager/1.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.294747 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-k2rdd_ce4c06a7-4bcb-4167-bec1-14a45ca24bea/kube-rbac-proxy/0.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.350476 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-269d2_a8e49781-2e0b-476d-be9f-e17f05639447/kube-rbac-proxy/0.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.482079 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-269d2_a8e49781-2e0b-476d-be9f-e17f05639447/manager/2.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.501249 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-269d2_a8e49781-2e0b-476d-be9f-e17f05639447/manager/1.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.563291 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wwjsd_a72e6d14-1571-4b70-b872-a4a4b0b3c242/operator/2.log" Nov 26 
15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.606808 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wwjsd_a72e6d14-1571-4b70-b872-a4a4b0b3c242/operator/1.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.765376 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-6kjgs_719afb5d-40c4-4fa3-b030-38c170fc7dbb/kube-rbac-proxy/0.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.812016 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-6kjgs_719afb5d-40c4-4fa3-b030-38c170fc7dbb/manager/2.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.831623 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-6kjgs_719afb5d-40c4-4fa3-b030-38c170fc7dbb/manager/1.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.956206 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-8zvlb_66532d04-3411-4813-ae53-4d635ee98911/kube-rbac-proxy/0.log" Nov 26 15:16:17 crc kubenswrapper[4651]: I1126 15:16:17.991195 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-8zvlb_66532d04-3411-4813-ae53-4d635ee98911/manager/2.log" Nov 26 15:16:18 crc kubenswrapper[4651]: I1126 15:16:18.047707 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-8zvlb_66532d04-3411-4813-ae53-4d635ee98911/manager/1.log" Nov 26 15:16:18 crc kubenswrapper[4651]: I1126 15:16:18.130235 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-l2z9w_8cd427a2-9759-460e-b86e-23e08dd7ba78/kube-rbac-proxy/0.log" Nov 26 15:16:18 crc kubenswrapper[4651]: I1126 15:16:18.206214 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-l2z9w_8cd427a2-9759-460e-b86e-23e08dd7ba78/manager/1.log" Nov 26 15:16:18 crc kubenswrapper[4651]: I1126 15:16:18.210013 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-l2z9w_8cd427a2-9759-460e-b86e-23e08dd7ba78/manager/0.log" Nov 26 15:16:18 crc kubenswrapper[4651]: I1126 15:16:18.351240 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-s5dd9_e8ad6eac-027c-4615-a5dd-6facdc1db056/kube-rbac-proxy/0.log" Nov 26 15:16:18 crc kubenswrapper[4651]: I1126 15:16:18.372214 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-s5dd9_e8ad6eac-027c-4615-a5dd-6facdc1db056/manager/1.log" Nov 26 15:16:18 crc kubenswrapper[4651]: I1126 15:16:18.418147 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-s5dd9_e8ad6eac-027c-4615-a5dd-6facdc1db056/manager/2.log" Nov 26 15:16:19 crc kubenswrapper[4651]: I1126 15:16:19.044105 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-6ld6v"] Nov 26 15:16:19 crc kubenswrapper[4651]: I1126 15:16:19.054282 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-6ld6v"] Nov 26 15:16:19 crc kubenswrapper[4651]: I1126 15:16:19.414260 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dfcc6ac-236d-4333-9126-5ee10d1e0417" path="/var/lib/kubelet/pods/0dfcc6ac-236d-4333-9126-5ee10d1e0417/volumes" Nov 26 15:16:21 crc 
kubenswrapper[4651]: I1126 15:16:21.402668 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" Nov 26 15:16:21 crc kubenswrapper[4651]: E1126 15:16:21.402929 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.042600 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c56c-account-create-update-j5bf5"] Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.053496 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6fkm4"] Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.063558 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-wvffz"] Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.077592 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bb66-account-create-update-s2ll5"] Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.089824 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6fkm4"] Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.103236 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-wvffz"] Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.114552 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c56c-account-create-update-j5bf5"] Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.124622 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bb66-account-create-update-s2ll5"] Nov 
26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.132531 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-28wwh"] Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.142320 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-28wwh"] Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.161086 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3572-account-create-update-b7cth"] Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.170495 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3572-account-create-update-b7cth"] Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.413626 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a7b363a-a7d4-4197-b711-2d3a0b761273" path="/var/lib/kubelet/pods/0a7b363a-a7d4-4197-b711-2d3a0b761273/volumes" Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.415562 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf0489a-9b4f-4cd4-95a8-42a5fd115b89" path="/var/lib/kubelet/pods/3bf0489a-9b4f-4cd4-95a8-42a5fd115b89/volumes" Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.435697 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ffdfc1-d77f-4094-a0ba-2800d4c4d878" path="/var/lib/kubelet/pods/88ffdfc1-d77f-4094-a0ba-2800d4c4d878/volumes" Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.442900 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d514364-b561-4d18-9b82-bfd428216060" path="/var/lib/kubelet/pods/8d514364-b561-4d18-9b82-bfd428216060/volumes" Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.445416 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78599f6-a349-4abd-b862-37ea4d85818d" path="/var/lib/kubelet/pods/a78599f6-a349-4abd-b862-37ea4d85818d/volumes" Nov 26 15:16:33 crc kubenswrapper[4651]: I1126 15:16:33.448589 4651 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46f23b6-3605-4160-a29e-b7f2a84b48f5" path="/var/lib/kubelet/pods/f46f23b6-3605-4160-a29e-b7f2a84b48f5/volumes" Nov 26 15:16:36 crc kubenswrapper[4651]: I1126 15:16:36.401909 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" Nov 26 15:16:36 crc kubenswrapper[4651]: E1126 15:16:36.402604 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" Nov 26 15:16:37 crc kubenswrapper[4651]: I1126 15:16:37.979910 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-skz22_df7a5934-8cbe-48de-badf-a0bf93119820/control-plane-machine-set-operator/0.log" Nov 26 15:16:38 crc kubenswrapper[4651]: I1126 15:16:38.057779 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-skjk9_4b814105-58ac-41b6-8b52-efa5de815233/kube-rbac-proxy/0.log" Nov 26 15:16:38 crc kubenswrapper[4651]: I1126 15:16:38.079355 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-skjk9_4b814105-58ac-41b6-8b52-efa5de815233/machine-api-operator/0.log" Nov 26 15:16:39 crc kubenswrapper[4651]: I1126 15:16:39.032938 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-72sf9"] Nov 26 15:16:39 crc kubenswrapper[4651]: I1126 15:16:39.047309 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-72sf9"] Nov 26 15:16:39 crc kubenswrapper[4651]: I1126 
15:16:39.412228 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae123901-25f9-4788-b666-bcb72066c3c4" path="/var/lib/kubelet/pods/ae123901-25f9-4788-b666-bcb72066c3c4/volumes" Nov 26 15:16:48 crc kubenswrapper[4651]: I1126 15:16:48.402449 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" Nov 26 15:16:48 crc kubenswrapper[4651]: E1126 15:16:48.403359 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.315118 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-qksf2_00fbc8fa-7a28-4489-9e75-5c309cc83d87/cert-manager-controller/0.log" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.321171 4651 scope.go:117] "RemoveContainer" containerID="63b6c5f574909681e2f30909c9ec4f70dc295acd9be30f0a3b390a641bfa9e84" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.353540 4651 scope.go:117] "RemoveContainer" containerID="f12a3b347d022be2751ae2ce9c080d9b7a93763a9d10ed87767a90545aec447a" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.423165 4651 scope.go:117] "RemoveContainer" containerID="9d2f12c635705e90d0fc108cd109541e25caf715d5cc945e0cedf40c55377682" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.459448 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-rnxlv_38d33193-a6f6-42e3-982d-59abd72e12f2/cert-manager-cainjector/0.log" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.485483 4651 log.go:25] "Finished parsing log 
file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-wvl8s_0ff4812f-612b-438b-8a92-7a9ff86bcdda/cert-manager-webhook/0.log" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.497803 4651 scope.go:117] "RemoveContainer" containerID="1da4d98336b2602d323333a3b2107cce160186ae704bf29529accdb4d7a714a6" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.556719 4651 scope.go:117] "RemoveContainer" containerID="f05be8fa92b62730f2c248ba7b13e7f3aafdad78aadb18091ced8d6997a062f4" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.632157 4651 scope.go:117] "RemoveContainer" containerID="a21cad0ca94a05e4d63a0013598a2ae472c411d7f7795fed14423ae73ab97c88" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.663521 4651 scope.go:117] "RemoveContainer" containerID="e7911fc74bf464bdbbbd1fa053975a83c91c74fa5c7ee1865fb6c895f6b42637" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.713752 4651 scope.go:117] "RemoveContainer" containerID="9cb99b3819cc0f226317e4c505f91704ca7220f0491a790cef76642ec169ef8f" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.746857 4651 scope.go:117] "RemoveContainer" containerID="536eddbf3a5d17779f9e0f97742827c4d08a433086a5f5c60e49071adc3f6111" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.773443 4651 scope.go:117] "RemoveContainer" containerID="12001aff69ebce8862f2b559eb46b5f5569de2a9bb0817e8d30888970aee26aa" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.794566 4651 scope.go:117] "RemoveContainer" containerID="69034c1c606c777a387ebc575a28e24076e705a9cc33a01fb844194a5080c732" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.824471 4651 scope.go:117] "RemoveContainer" containerID="02c416faca21be52dd878cec856b81bd111a0dfa3543757c666d60e9829e50a2" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.855353 4651 scope.go:117] "RemoveContainer" containerID="533ff47ca916e9c0b15ad431af874c76a9c98d1cbb32f6e2e069fdf6d346a74b" Nov 26 15:16:50 crc kubenswrapper[4651]: I1126 15:16:50.877408 4651 
scope.go:117] "RemoveContainer" containerID="fe75ce8679b38adb7c5ccd3ee6a133021aa2b07458230cccece86e49f311440c" Nov 26 15:16:59 crc kubenswrapper[4651]: I1126 15:16:59.402414 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" Nov 26 15:16:59 crc kubenswrapper[4651]: E1126 15:16:59.403394 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" Nov 26 15:17:02 crc kubenswrapper[4651]: I1126 15:17:02.793477 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-lnrjj_ea25780c-5944-4cbf-a8f0-e1e3dd4617f7/nmstate-console-plugin/0.log" Nov 26 15:17:03 crc kubenswrapper[4651]: I1126 15:17:03.010691 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fz8kc_66a988fd-9365-4310-ae73-c26d1da23d30/nmstate-handler/0.log" Nov 26 15:17:03 crc kubenswrapper[4651]: I1126 15:17:03.060925 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-kl9fj_fbfcbea7-d7be-4196-9146-159b5fdc8afa/kube-rbac-proxy/0.log" Nov 26 15:17:03 crc kubenswrapper[4651]: I1126 15:17:03.200024 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-kl9fj_fbfcbea7-d7be-4196-9146-159b5fdc8afa/nmstate-metrics/0.log" Nov 26 15:17:03 crc kubenswrapper[4651]: I1126 15:17:03.270403 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-g2r5z_03019cf8-465a-4be9-b1f1-3137424cde1c/nmstate-operator/0.log" Nov 26 15:17:03 crc 
kubenswrapper[4651]: I1126 15:17:03.433560 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-7lwbk_8e7b8425-38cf-4ff3-967d-18555db204a9/nmstate-webhook/0.log" Nov 26 15:17:12 crc kubenswrapper[4651]: I1126 15:17:12.402062 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" Nov 26 15:17:12 crc kubenswrapper[4651]: E1126 15:17:12.402733 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" Nov 26 15:17:17 crc kubenswrapper[4651]: I1126 15:17:17.635985 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-f7fh8_46d32f11-ab11-45c9-8ba1-118d3cf10bcd/kube-rbac-proxy/0.log" Nov 26 15:17:17 crc kubenswrapper[4651]: I1126 15:17:17.693420 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-f7fh8_46d32f11-ab11-45c9-8ba1-118d3cf10bcd/controller/0.log" Nov 26 15:17:17 crc kubenswrapper[4651]: I1126 15:17:17.852922 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-frr-files/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.052864 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-frr-files/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.095313 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-metrics/0.log" Nov 26 15:17:18 
crc kubenswrapper[4651]: I1126 15:17:18.113074 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-reloader/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.154334 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-reloader/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.322880 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-frr-files/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.387192 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-metrics/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.412977 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-metrics/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.413839 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-reloader/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.620657 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-reloader/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.626273 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-frr-files/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.685014 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/controller/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.730654 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/cp-metrics/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.843248 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/frr-metrics/0.log" Nov 26 15:17:18 crc kubenswrapper[4651]: I1126 15:17:18.929575 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/kube-rbac-proxy-frr/0.log" Nov 26 15:17:19 crc kubenswrapper[4651]: I1126 15:17:19.022446 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/kube-rbac-proxy/0.log" Nov 26 15:17:19 crc kubenswrapper[4651]: I1126 15:17:19.044199 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9glnv"] Nov 26 15:17:19 crc kubenswrapper[4651]: I1126 15:17:19.062225 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9glnv"] Nov 26 15:17:19 crc kubenswrapper[4651]: I1126 15:17:19.205873 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/reloader/0.log" Nov 26 15:17:19 crc kubenswrapper[4651]: I1126 15:17:19.376257 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-xrp9z_0d2b53ca-9ab2-4845-a2f8-eacbe6fa4e29/frr-k8s-webhook-server/0.log" Nov 26 15:17:19 crc kubenswrapper[4651]: I1126 15:17:19.412372 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="147296af-97b7-4982-ab39-d7f3b78f042d" path="/var/lib/kubelet/pods/147296af-97b7-4982-ab39-d7f3b78f042d/volumes" Nov 26 15:17:19 crc kubenswrapper[4651]: I1126 15:17:19.634370 4651 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b5d786cf6-wsrgh_f688796e-89d5-4da8-8dc7-786c5940b853/manager/3.log"
Nov 26 15:17:19 crc kubenswrapper[4651]: I1126 15:17:19.641697 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbhmh_1482b73c-9ed6-4292-9061-9df617e0f312/frr/0.log"
Nov 26 15:17:19 crc kubenswrapper[4651]: I1126 15:17:19.649296 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b5d786cf6-wsrgh_f688796e-89d5-4da8-8dc7-786c5940b853/manager/2.log"
Nov 26 15:17:19 crc kubenswrapper[4651]: I1126 15:17:19.871373 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d67bf6468-cdmmw_f1830fea-fcaa-4159-a4c9-20787b409237/webhook-server/0.log"
Nov 26 15:17:19 crc kubenswrapper[4651]: I1126 15:17:19.880135 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-p9nbz_05a4e64e-8b51-45a8-be15-4c081281809f/kube-rbac-proxy/0.log"
Nov 26 15:17:20 crc kubenswrapper[4651]: I1126 15:17:20.485384 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-p9nbz_05a4e64e-8b51-45a8-be15-4c081281809f/speaker/0.log"
Nov 26 15:17:24 crc kubenswrapper[4651]: I1126 15:17:24.402511 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:17:24 crc kubenswrapper[4651]: E1126 15:17:24.403026 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:17:32 crc kubenswrapper[4651]: I1126 15:17:32.602952 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9_83e0d76b-27c5-4f78-8f50-48beac51f214/util/0.log"
Nov 26 15:17:32 crc kubenswrapper[4651]: I1126 15:17:32.813004 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9_83e0d76b-27c5-4f78-8f50-48beac51f214/util/0.log"
Nov 26 15:17:32 crc kubenswrapper[4651]: I1126 15:17:32.839736 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9_83e0d76b-27c5-4f78-8f50-48beac51f214/pull/0.log"
Nov 26 15:17:32 crc kubenswrapper[4651]: I1126 15:17:32.880347 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9_83e0d76b-27c5-4f78-8f50-48beac51f214/pull/0.log"
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.034301 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-n9whp"]
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.051442 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-24pqd"]
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.060211 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-n9whp"]
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.072006 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-24pqd"]
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.115281 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9_83e0d76b-27c5-4f78-8f50-48beac51f214/extract/0.log"
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.161600 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9_83e0d76b-27c5-4f78-8f50-48beac51f214/pull/0.log"
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.214822 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9gjf9_83e0d76b-27c5-4f78-8f50-48beac51f214/util/0.log"
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.305631 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z5pzs_3b82390c-5e5e-4d38-91ea-7b1a1c3820d7/extract-utilities/0.log"
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.416426 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eca4c8f-cc45-46f6-8730-187af536d3b1" path="/var/lib/kubelet/pods/2eca4c8f-cc45-46f6-8730-187af536d3b1/volumes"
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.419715 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1259668-c013-4143-b8b4-677a639a764e" path="/var/lib/kubelet/pods/c1259668-c013-4143-b8b4-677a639a764e/volumes"
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.526142 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z5pzs_3b82390c-5e5e-4d38-91ea-7b1a1c3820d7/extract-content/0.log"
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.566619 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z5pzs_3b82390c-5e5e-4d38-91ea-7b1a1c3820d7/extract-content/0.log"
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.568195 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z5pzs_3b82390c-5e5e-4d38-91ea-7b1a1c3820d7/extract-utilities/0.log"
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.769018 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z5pzs_3b82390c-5e5e-4d38-91ea-7b1a1c3820d7/extract-utilities/0.log"
Nov 26 15:17:33 crc kubenswrapper[4651]: I1126 15:17:33.769426 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z5pzs_3b82390c-5e5e-4d38-91ea-7b1a1c3820d7/extract-content/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.049445 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsg6c_999d25b1-0acb-4cb4-bb7b-f65d770cf7e6/extract-utilities/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.084906 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z5pzs_3b82390c-5e5e-4d38-91ea-7b1a1c3820d7/registry-server/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.249715 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsg6c_999d25b1-0acb-4cb4-bb7b-f65d770cf7e6/extract-utilities/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.269056 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsg6c_999d25b1-0acb-4cb4-bb7b-f65d770cf7e6/extract-content/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.288101 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsg6c_999d25b1-0acb-4cb4-bb7b-f65d770cf7e6/extract-content/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.430728 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsg6c_999d25b1-0acb-4cb4-bb7b-f65d770cf7e6/extract-utilities/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.501879 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsg6c_999d25b1-0acb-4cb4-bb7b-f65d770cf7e6/extract-content/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.740977 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4_ca827f37-4b80-4699-91a5-8074b68a628c/util/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.762511 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsg6c_999d25b1-0acb-4cb4-bb7b-f65d770cf7e6/registry-server/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.869321 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4_ca827f37-4b80-4699-91a5-8074b68a628c/util/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.890121 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4_ca827f37-4b80-4699-91a5-8074b68a628c/pull/0.log"
Nov 26 15:17:34 crc kubenswrapper[4651]: I1126 15:17:34.974352 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4_ca827f37-4b80-4699-91a5-8074b68a628c/pull/0.log"
Nov 26 15:17:35 crc kubenswrapper[4651]: I1126 15:17:35.166069 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4_ca827f37-4b80-4699-91a5-8074b68a628c/extract/0.log"
Nov 26 15:17:35 crc kubenswrapper[4651]: I1126 15:17:35.208629 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4_ca827f37-4b80-4699-91a5-8074b68a628c/util/0.log"
Nov 26 15:17:35 crc kubenswrapper[4651]: I1126 15:17:35.232328 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6f4tc4_ca827f37-4b80-4699-91a5-8074b68a628c/pull/0.log"
Nov 26 15:17:35 crc kubenswrapper[4651]: I1126 15:17:35.377356 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zhtwk_ab73482d-4b1c-481f-9728-36d8505e8a9b/marketplace-operator/0.log"
Nov 26 15:17:35 crc kubenswrapper[4651]: I1126 15:17:35.414478 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vtwvs_31d40458-c747-413d-8a81-43906c765b3d/extract-utilities/0.log"
Nov 26 15:17:35 crc kubenswrapper[4651]: I1126 15:17:35.640395 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vtwvs_31d40458-c747-413d-8a81-43906c765b3d/extract-utilities/0.log"
Nov 26 15:17:35 crc kubenswrapper[4651]: I1126 15:17:35.675869 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vtwvs_31d40458-c747-413d-8a81-43906c765b3d/extract-content/0.log"
Nov 26 15:17:35 crc kubenswrapper[4651]: I1126 15:17:35.702982 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vtwvs_31d40458-c747-413d-8a81-43906c765b3d/extract-content/0.log"
Nov 26 15:17:35 crc kubenswrapper[4651]: I1126 15:17:35.902593 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vtwvs_31d40458-c747-413d-8a81-43906c765b3d/extract-content/0.log"
Nov 26 15:17:35 crc kubenswrapper[4651]: I1126 15:17:35.935594 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vtwvs_31d40458-c747-413d-8a81-43906c765b3d/extract-utilities/0.log"
Nov 26 15:17:36 crc kubenswrapper[4651]: I1126 15:17:36.038578 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vtwvs_31d40458-c747-413d-8a81-43906c765b3d/registry-server/0.log"
Nov 26 15:17:36 crc kubenswrapper[4651]: I1126 15:17:36.112988 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qzt4w_5db8f2dc-6548-4271-9e83-27e1fc8ab069/extract-utilities/0.log"
Nov 26 15:17:36 crc kubenswrapper[4651]: I1126 15:17:36.341379 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qzt4w_5db8f2dc-6548-4271-9e83-27e1fc8ab069/extract-utilities/0.log"
Nov 26 15:17:36 crc kubenswrapper[4651]: I1126 15:17:36.362633 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qzt4w_5db8f2dc-6548-4271-9e83-27e1fc8ab069/extract-content/0.log"
Nov 26 15:17:36 crc kubenswrapper[4651]: I1126 15:17:36.379091 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qzt4w_5db8f2dc-6548-4271-9e83-27e1fc8ab069/extract-content/0.log"
Nov 26 15:17:36 crc kubenswrapper[4651]: I1126 15:17:36.533278 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qzt4w_5db8f2dc-6548-4271-9e83-27e1fc8ab069/extract-content/0.log"
Nov 26 15:17:36 crc kubenswrapper[4651]: I1126 15:17:36.533612 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qzt4w_5db8f2dc-6548-4271-9e83-27e1fc8ab069/extract-utilities/0.log"
Nov 26 15:17:36 crc kubenswrapper[4651]: I1126 15:17:36.931477 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qzt4w_5db8f2dc-6548-4271-9e83-27e1fc8ab069/registry-server/0.log"
Nov 26 15:17:37 crc kubenswrapper[4651]: I1126 15:17:37.402207 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:17:37 crc kubenswrapper[4651]: E1126 15:17:37.402669 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:17:44 crc kubenswrapper[4651]: I1126 15:17:44.033947 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-p6s6f"]
Nov 26 15:17:44 crc kubenswrapper[4651]: I1126 15:17:44.049977 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-p6s6f"]
Nov 26 15:17:45 crc kubenswrapper[4651]: I1126 15:17:45.413750 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b" path="/var/lib/kubelet/pods/81bbd5b8-0c7e-45ab-b0fd-a097fc34ce9b/volumes"
Nov 26 15:17:48 crc kubenswrapper[4651]: I1126 15:17:48.035533 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-wzxcr"]
Nov 26 15:17:48 crc kubenswrapper[4651]: I1126 15:17:48.044727 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-wzxcr"]
Nov 26 15:17:49 crc kubenswrapper[4651]: I1126 15:17:49.416249 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b39efce-2985-4f46-91a2-bb397f605c9c" path="/var/lib/kubelet/pods/0b39efce-2985-4f46-91a2-bb397f605c9c/volumes"
Nov 26 15:17:51 crc kubenswrapper[4651]: I1126 15:17:51.197863 4651 scope.go:117] "RemoveContainer" containerID="24b0058da5b36097879a9dd3bfbd2e2aa5d0acde3fd564408286e8951f80181e"
Nov 26 15:17:51 crc kubenswrapper[4651]: I1126 15:17:51.235923 4651 scope.go:117] "RemoveContainer" containerID="5492b1754fa8232d4cd55c7e05742c9f77be19b628a03f36d29f730caabe2475"
Nov 26 15:17:51 crc kubenswrapper[4651]: I1126 15:17:51.305263 4651 scope.go:117] "RemoveContainer" containerID="6d1911b8e56de816e038b6661a78c7991b66e0a55e28c0d04c6a75bd20e26e83"
Nov 26 15:17:51 crc kubenswrapper[4651]: I1126 15:17:51.364279 4651 scope.go:117] "RemoveContainer" containerID="6cb5e96d1bc453c6092225e2597e5183ffb18300836da01876f88d3898c4b4da"
Nov 26 15:17:51 crc kubenswrapper[4651]: I1126 15:17:51.435722 4651 scope.go:117] "RemoveContainer" containerID="d8a8e392b5d7dad2f06ca07da9cb43f1113ca204035247a724c86fcddcc08f46"
Nov 26 15:17:52 crc kubenswrapper[4651]: I1126 15:17:52.402772 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:17:52 crc kubenswrapper[4651]: E1126 15:17:52.403301 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:18:06 crc kubenswrapper[4651]: I1126 15:18:06.402128 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:18:06 crc kubenswrapper[4651]: E1126 15:18:06.403080 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:18:21 crc kubenswrapper[4651]: I1126 15:18:21.404238 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:18:21 crc kubenswrapper[4651]: E1126 15:18:21.404976 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:18:28 crc kubenswrapper[4651]: I1126 15:18:28.049012 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e077-account-create-update-9ntsd"]
Nov 26 15:18:28 crc kubenswrapper[4651]: I1126 15:18:28.061242 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-aa1b-account-create-update-zsh2r"]
Nov 26 15:18:28 crc kubenswrapper[4651]: I1126 15:18:28.079135 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-jjrbk"]
Nov 26 15:18:28 crc kubenswrapper[4651]: I1126 15:18:28.089616 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-n2799"]
Nov 26 15:18:28 crc kubenswrapper[4651]: I1126 15:18:28.097817 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-aa1b-account-create-update-zsh2r"]
Nov 26 15:18:28 crc kubenswrapper[4651]: I1126 15:18:28.106024 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-jjrbk"]
Nov 26 15:18:28 crc kubenswrapper[4651]: I1126 15:18:28.114269 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-n2799"]
Nov 26 15:18:28 crc kubenswrapper[4651]: I1126 15:18:28.124149 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e077-account-create-update-9ntsd"]
Nov 26 15:18:29 crc kubenswrapper[4651]: I1126 15:18:29.041162 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-77b2-account-create-update-v7shh"]
Nov 26 15:18:29 crc kubenswrapper[4651]: I1126 15:18:29.049346 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-7k6w2"]
Nov 26 15:18:29 crc kubenswrapper[4651]: I1126 15:18:29.057779 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-7k6w2"]
Nov 26 15:18:29 crc kubenswrapper[4651]: I1126 15:18:29.065411 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-77b2-account-create-update-v7shh"]
Nov 26 15:18:29 crc kubenswrapper[4651]: I1126 15:18:29.420874 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e0710f-88c5-4a6a-96d5-97f4a934eeed" path="/var/lib/kubelet/pods/34e0710f-88c5-4a6a-96d5-97f4a934eeed/volumes"
Nov 26 15:18:29 crc kubenswrapper[4651]: I1126 15:18:29.423400 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5053f0e2-7865-4be5-9601-2c69da731509" path="/var/lib/kubelet/pods/5053f0e2-7865-4be5-9601-2c69da731509/volumes"
Nov 26 15:18:29 crc kubenswrapper[4651]: I1126 15:18:29.424610 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5279318-bb6d-455f-96a5-d410d0468c6b" path="/var/lib/kubelet/pods/b5279318-bb6d-455f-96a5-d410d0468c6b/volumes"
Nov 26 15:18:29 crc kubenswrapper[4651]: I1126 15:18:29.425944 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce0bade-306a-4aa8-bbff-a24a79d73e22" path="/var/lib/kubelet/pods/cce0bade-306a-4aa8-bbff-a24a79d73e22/volumes"
Nov 26 15:18:29 crc kubenswrapper[4651]: I1126 15:18:29.426562 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc0af2b0-0d8a-488e-97d3-5956869cd9e9" path="/var/lib/kubelet/pods/dc0af2b0-0d8a-488e-97d3-5956869cd9e9/volumes"
Nov 26 15:18:29 crc kubenswrapper[4651]: I1126 15:18:29.428654 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9edcf88-8f1d-419e-afd8-5e5861d5b5ad" path="/var/lib/kubelet/pods/e9edcf88-8f1d-419e-afd8-5e5861d5b5ad/volumes"
Nov 26 15:18:33 crc kubenswrapper[4651]: I1126 15:18:33.409944 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:18:33 crc kubenswrapper[4651]: E1126 15:18:33.410546 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:18:46 crc kubenswrapper[4651]: I1126 15:18:46.403887 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:18:46 crc kubenswrapper[4651]: E1126 15:18:46.404895 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:18:51 crc kubenswrapper[4651]: I1126 15:18:51.701011 4651 scope.go:117] "RemoveContainer" containerID="dd620561a989ccabf95243d7d0c2da8791578a36ce2368d783ef2c81ba468fea"
Nov 26 15:18:51 crc kubenswrapper[4651]: I1126 15:18:51.748849 4651 scope.go:117] "RemoveContainer" containerID="5b4f2e94eeddb1f14656a8ccf11f737869c9fe1c77ec64146502e320a265190b"
Nov 26 15:18:51 crc kubenswrapper[4651]: I1126 15:18:51.791330 4651 scope.go:117] "RemoveContainer" containerID="1a2c81fc4664a228015b39d5625de039f2eb37b136c86f7e2a4d9d34504fa25c"
Nov 26 15:18:51 crc kubenswrapper[4651]: I1126 15:18:51.832237 4651 scope.go:117] "RemoveContainer" containerID="407b2ba4795ba4c8ffdb266d5b919b0403ad2cfe5961c060ec6298e1621c9aa3"
Nov 26 15:18:51 crc kubenswrapper[4651]: I1126 15:18:51.877554 4651 scope.go:117] "RemoveContainer" containerID="b3983f8284e08861786acdea87ae2b3035437f66338617f3ce18f76772a9aa6b"
Nov 26 15:18:51 crc kubenswrapper[4651]: I1126 15:18:51.927303 4651 scope.go:117] "RemoveContainer" containerID="ea59fba41340cffb7387d09140bf315eea250a2fc013b18c7979fa7e3833c91c"
Nov 26 15:18:58 crc kubenswrapper[4651]: I1126 15:18:58.402074 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:18:58 crc kubenswrapper[4651]: E1126 15:18:58.403060 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:19:11 crc kubenswrapper[4651]: I1126 15:19:11.403426 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:19:11 crc kubenswrapper[4651]: E1126 15:19:11.406310 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:19:12 crc kubenswrapper[4651]: I1126 15:19:12.229664 4651 generic.go:334] "Generic (PLEG): container finished" podID="c87e7492-30cc-4750-b99a-5ef41775cb9b" containerID="429ae40c99b5c5713b5e0fe50030a9d73098925b7685c218e7f330f5a06109a9" exitCode=0
Nov 26 15:19:12 crc kubenswrapper[4651]: I1126 15:19:12.229743 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdvq7/must-gather-p8cln" event={"ID":"c87e7492-30cc-4750-b99a-5ef41775cb9b","Type":"ContainerDied","Data":"429ae40c99b5c5713b5e0fe50030a9d73098925b7685c218e7f330f5a06109a9"}
Nov 26 15:19:12 crc kubenswrapper[4651]: I1126 15:19:12.230468 4651 scope.go:117] "RemoveContainer" containerID="429ae40c99b5c5713b5e0fe50030a9d73098925b7685c218e7f330f5a06109a9"
Nov 26 15:19:12 crc kubenswrapper[4651]: I1126 15:19:12.694968 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kdvq7_must-gather-p8cln_c87e7492-30cc-4750-b99a-5ef41775cb9b/gather/0.log"
Nov 26 15:19:17 crc kubenswrapper[4651]: I1126 15:19:17.082325 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-shlzt"]
Nov 26 15:19:17 crc kubenswrapper[4651]: I1126 15:19:17.098225 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-shlzt"]
Nov 26 15:19:17 crc kubenswrapper[4651]: I1126 15:19:17.414407 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20922a1a-1763-45a9-911a-161e1fc4bd1e" path="/var/lib/kubelet/pods/20922a1a-1763-45a9-911a-161e1fc4bd1e/volumes"
Nov 26 15:19:20 crc kubenswrapper[4651]: I1126 15:19:20.986701 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kdvq7/must-gather-p8cln"]
Nov 26 15:19:20 crc kubenswrapper[4651]: I1126 15:19:20.987411 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kdvq7/must-gather-p8cln" podUID="c87e7492-30cc-4750-b99a-5ef41775cb9b" containerName="copy" containerID="cri-o://6e469a2d22cf80c382e0e65a313ac0d3dfee15d99495cd3d452de19707ef77e8" gracePeriod=2
Nov 26 15:19:20 crc kubenswrapper[4651]: I1126 15:19:20.998326 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kdvq7/must-gather-p8cln"]
Nov 26 15:19:21 crc kubenswrapper[4651]: I1126 15:19:21.331191 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kdvq7_must-gather-p8cln_c87e7492-30cc-4750-b99a-5ef41775cb9b/copy/0.log"
Nov 26 15:19:21 crc kubenswrapper[4651]: I1126 15:19:21.331567 4651 generic.go:334] "Generic (PLEG): container finished" podID="c87e7492-30cc-4750-b99a-5ef41775cb9b" containerID="6e469a2d22cf80c382e0e65a313ac0d3dfee15d99495cd3d452de19707ef77e8" exitCode=143
Nov 26 15:19:21 crc kubenswrapper[4651]: I1126 15:19:21.568732 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kdvq7_must-gather-p8cln_c87e7492-30cc-4750-b99a-5ef41775cb9b/copy/0.log"
Nov 26 15:19:21 crc kubenswrapper[4651]: I1126 15:19:21.569419 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdvq7/must-gather-p8cln"
Nov 26 15:19:21 crc kubenswrapper[4651]: I1126 15:19:21.669163 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5lrp\" (UniqueName: \"kubernetes.io/projected/c87e7492-30cc-4750-b99a-5ef41775cb9b-kube-api-access-s5lrp\") pod \"c87e7492-30cc-4750-b99a-5ef41775cb9b\" (UID: \"c87e7492-30cc-4750-b99a-5ef41775cb9b\") "
Nov 26 15:19:21 crc kubenswrapper[4651]: I1126 15:19:21.669345 4651 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c87e7492-30cc-4750-b99a-5ef41775cb9b-must-gather-output\") pod \"c87e7492-30cc-4750-b99a-5ef41775cb9b\" (UID: \"c87e7492-30cc-4750-b99a-5ef41775cb9b\") "
Nov 26 15:19:21 crc kubenswrapper[4651]: I1126 15:19:21.682276 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87e7492-30cc-4750-b99a-5ef41775cb9b-kube-api-access-s5lrp" (OuterVolumeSpecName: "kube-api-access-s5lrp") pod "c87e7492-30cc-4750-b99a-5ef41775cb9b" (UID: "c87e7492-30cc-4750-b99a-5ef41775cb9b"). InnerVolumeSpecName "kube-api-access-s5lrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 15:19:21 crc kubenswrapper[4651]: I1126 15:19:21.776353 4651 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5lrp\" (UniqueName: \"kubernetes.io/projected/c87e7492-30cc-4750-b99a-5ef41775cb9b-kube-api-access-s5lrp\") on node \"crc\" DevicePath \"\""
Nov 26 15:19:21 crc kubenswrapper[4651]: I1126 15:19:21.797815 4651 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c87e7492-30cc-4750-b99a-5ef41775cb9b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c87e7492-30cc-4750-b99a-5ef41775cb9b" (UID: "c87e7492-30cc-4750-b99a-5ef41775cb9b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 15:19:21 crc kubenswrapper[4651]: I1126 15:19:21.878510 4651 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c87e7492-30cc-4750-b99a-5ef41775cb9b-must-gather-output\") on node \"crc\" DevicePath \"\""
Nov 26 15:19:22 crc kubenswrapper[4651]: I1126 15:19:22.341117 4651 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kdvq7_must-gather-p8cln_c87e7492-30cc-4750-b99a-5ef41775cb9b/copy/0.log"
Nov 26 15:19:22 crc kubenswrapper[4651]: I1126 15:19:22.341617 4651 scope.go:117] "RemoveContainer" containerID="6e469a2d22cf80c382e0e65a313ac0d3dfee15d99495cd3d452de19707ef77e8"
Nov 26 15:19:22 crc kubenswrapper[4651]: I1126 15:19:22.341658 4651 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdvq7/must-gather-p8cln"
Nov 26 15:19:22 crc kubenswrapper[4651]: I1126 15:19:22.362533 4651 scope.go:117] "RemoveContainer" containerID="429ae40c99b5c5713b5e0fe50030a9d73098925b7685c218e7f330f5a06109a9"
Nov 26 15:19:23 crc kubenswrapper[4651]: I1126 15:19:23.417960 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c87e7492-30cc-4750-b99a-5ef41775cb9b" path="/var/lib/kubelet/pods/c87e7492-30cc-4750-b99a-5ef41775cb9b/volumes"
Nov 26 15:19:24 crc kubenswrapper[4651]: I1126 15:19:24.401963 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:19:24 crc kubenswrapper[4651]: E1126 15:19:24.402614 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:19:35 crc kubenswrapper[4651]: I1126 15:19:35.404495 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:19:35 crc kubenswrapper[4651]: E1126 15:19:35.405291 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:19:37 crc kubenswrapper[4651]: I1126 15:19:37.036606 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-tpjxc"]
Nov 26 15:19:37 crc kubenswrapper[4651]: I1126 15:19:37.047332 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-tpjxc"]
Nov 26 15:19:37 crc kubenswrapper[4651]: I1126 15:19:37.413226 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad85fcab-3573-4019-89bc-f35413ff0a9d" path="/var/lib/kubelet/pods/ad85fcab-3573-4019-89bc-f35413ff0a9d/volumes"
Nov 26 15:19:38 crc kubenswrapper[4651]: I1126 15:19:38.027709 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qmd5q"]
Nov 26 15:19:38 crc kubenswrapper[4651]: I1126 15:19:38.038956 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qmd5q"]
Nov 26 15:19:39 crc kubenswrapper[4651]: I1126 15:19:39.415812 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86a2f131-c449-4541-9822-75711dee8ad3" path="/var/lib/kubelet/pods/86a2f131-c449-4541-9822-75711dee8ad3/volumes"
Nov 26 15:19:48 crc kubenswrapper[4651]: I1126 15:19:48.402913 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:19:48 crc kubenswrapper[4651]: E1126 15:19:48.403896 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:19:52 crc kubenswrapper[4651]: I1126 15:19:52.080625 4651 scope.go:117] "RemoveContainer" containerID="685533ab88cef62222b040d35727bbfefc8d1e3e65af18434ed7296f91609871"
Nov 26 15:19:52 crc kubenswrapper[4651]: I1126 15:19:52.119983 4651 scope.go:117] "RemoveContainer" containerID="e6ee1006ac27ab0156c31ea55455ea2e3575009349594dabbb480fe3092cd6ad"
Nov 26 15:19:52 crc kubenswrapper[4651]: I1126 15:19:52.178917 4651 scope.go:117] "RemoveContainer" containerID="9eec96ebad56c4bde87309892af8528ac22137803befe3b50c792a3509d4efc1"
Nov 26 15:20:00 crc kubenswrapper[4651]: I1126 15:20:00.402561 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:20:00 crc kubenswrapper[4651]: E1126 15:20:00.403574 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:20:14 crc kubenswrapper[4651]: I1126 15:20:14.401904 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:20:14 crc kubenswrapper[4651]: E1126 15:20:14.404136 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:20:22 crc kubenswrapper[4651]: I1126 15:20:22.037814 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-48lfs"]
Nov 26 15:20:22 crc kubenswrapper[4651]: I1126 15:20:22.045585 4651 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-48lfs"]
Nov 26 15:20:23 crc kubenswrapper[4651]: I1126 15:20:23.412857 4651 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41567ca0-5457-4763-a8f9-b28588b4b7b1" path="/var/lib/kubelet/pods/41567ca0-5457-4763-a8f9-b28588b4b7b1/volumes"
Nov 26 15:20:28 crc kubenswrapper[4651]: I1126 15:20:28.402149 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:20:28 crc kubenswrapper[4651]: E1126 15:20:28.404900 4651 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-99mrs_openshift-machine-config-operator(1233982f-5a21-4fdd-98e0-e11b5cedc385)\"" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385"
Nov 26 15:20:43 crc kubenswrapper[4651]: I1126 15:20:43.417565 4651 scope.go:117] "RemoveContainer" containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14"
Nov 26 15:20:44 crc kubenswrapper[4651]: I1126 15:20:44.129443 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"6a928dfef1dd2e804039e872d4da8b634c4652799110b200e803b9ab8e7d3158"}
Nov 26 15:20:52 crc kubenswrapper[4651]: I1126 15:20:52.300313 4651 scope.go:117] "RemoveContainer" containerID="6170cfc0482c5eb2e8d56478263f3fd5df89a467d74a4d2cbce9f90980715d2d"
Nov 26 15:22:59 crc kubenswrapper[4651]: I1126 15:22:59.133411 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 15:22:59 crc kubenswrapper[4651]: I1126 15:22:59.135531 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 15:23:29 crc kubenswrapper[4651]: I1126 15:23:29.133142 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 15:23:29 crc kubenswrapper[4651]: I1126 15:23:29.133867 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 15:23:59 crc kubenswrapper[4651]: I1126 15:23:59.133329 4651 patch_prober.go:28] interesting pod/machine-config-daemon-99mrs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 15:23:59 crc kubenswrapper[4651]: I1126 15:23:59.134923 4651 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 15:23:59 crc kubenswrapper[4651]: I1126 15:23:59.135055 4651 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy"
pod="openshift-machine-config-operator/machine-config-daemon-99mrs" Nov 26 15:23:59 crc kubenswrapper[4651]: I1126 15:23:59.135803 4651 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a928dfef1dd2e804039e872d4da8b634c4652799110b200e803b9ab8e7d3158"} pod="openshift-machine-config-operator/machine-config-daemon-99mrs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:23:59 crc kubenswrapper[4651]: I1126 15:23:59.135934 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" podUID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerName="machine-config-daemon" containerID="cri-o://6a928dfef1dd2e804039e872d4da8b634c4652799110b200e803b9ab8e7d3158" gracePeriod=600 Nov 26 15:24:00 crc kubenswrapper[4651]: I1126 15:24:00.192969 4651 generic.go:334] "Generic (PLEG): container finished" podID="1233982f-5a21-4fdd-98e0-e11b5cedc385" containerID="6a928dfef1dd2e804039e872d4da8b634c4652799110b200e803b9ab8e7d3158" exitCode=0 Nov 26 15:24:00 crc kubenswrapper[4651]: I1126 15:24:00.193019 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerDied","Data":"6a928dfef1dd2e804039e872d4da8b634c4652799110b200e803b9ab8e7d3158"} Nov 26 15:24:00 crc kubenswrapper[4651]: I1126 15:24:00.193531 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-99mrs" event={"ID":"1233982f-5a21-4fdd-98e0-e11b5cedc385","Type":"ContainerStarted","Data":"8bc5b0ca7362880e1adb4cd4d51906dbb044a4f65557eca2c2ce6d46f5ee1b34"} Nov 26 15:24:00 crc kubenswrapper[4651]: I1126 15:24:00.193550 4651 scope.go:117] "RemoveContainer" 
containerID="9f2e05e84c06eac20301d8e6763adf30400f5de362c37bfdf55b9dd12de62e14" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.237357 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7c7qn"] Nov 26 15:24:16 crc kubenswrapper[4651]: E1126 15:24:16.238392 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87e7492-30cc-4750-b99a-5ef41775cb9b" containerName="gather" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.238408 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87e7492-30cc-4750-b99a-5ef41775cb9b" containerName="gather" Nov 26 15:24:16 crc kubenswrapper[4651]: E1126 15:24:16.238424 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29cdb444-80a3-488d-ab43-029c8e41210e" containerName="collect-profiles" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.238431 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="29cdb444-80a3-488d-ab43-029c8e41210e" containerName="collect-profiles" Nov 26 15:24:16 crc kubenswrapper[4651]: E1126 15:24:16.238459 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87e7492-30cc-4750-b99a-5ef41775cb9b" containerName="copy" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.238467 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87e7492-30cc-4750-b99a-5ef41775cb9b" containerName="copy" Nov 26 15:24:16 crc kubenswrapper[4651]: E1126 15:24:16.238486 4651 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16215dc1-5ab5-4e53-9b79-25c52f86147c" containerName="container-00" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.238492 4651 state_mem.go:107] "Deleted CPUSet assignment" podUID="16215dc1-5ab5-4e53-9b79-25c52f86147c" containerName="container-00" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.238712 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="16215dc1-5ab5-4e53-9b79-25c52f86147c" containerName="container-00" Nov 26 
15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.238728 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87e7492-30cc-4750-b99a-5ef41775cb9b" containerName="gather" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.238738 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87e7492-30cc-4750-b99a-5ef41775cb9b" containerName="copy" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.238751 4651 memory_manager.go:354] "RemoveStaleState removing state" podUID="29cdb444-80a3-488d-ab43-029c8e41210e" containerName="collect-profiles" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.240390 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.261045 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7c7qn"] Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.404073 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss82q\" (UniqueName: \"kubernetes.io/projected/78ba1908-cfe7-49cb-ae9f-befeb54c878c-kube-api-access-ss82q\") pod \"redhat-marketplace-7c7qn\" (UID: \"78ba1908-cfe7-49cb-ae9f-befeb54c878c\") " pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.404243 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ba1908-cfe7-49cb-ae9f-befeb54c878c-utilities\") pod \"redhat-marketplace-7c7qn\" (UID: \"78ba1908-cfe7-49cb-ae9f-befeb54c878c\") " pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.404295 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/78ba1908-cfe7-49cb-ae9f-befeb54c878c-catalog-content\") pod \"redhat-marketplace-7c7qn\" (UID: \"78ba1908-cfe7-49cb-ae9f-befeb54c878c\") " pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.505811 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ba1908-cfe7-49cb-ae9f-befeb54c878c-utilities\") pod \"redhat-marketplace-7c7qn\" (UID: \"78ba1908-cfe7-49cb-ae9f-befeb54c878c\") " pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.505877 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78ba1908-cfe7-49cb-ae9f-befeb54c878c-catalog-content\") pod \"redhat-marketplace-7c7qn\" (UID: \"78ba1908-cfe7-49cb-ae9f-befeb54c878c\") " pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.505909 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss82q\" (UniqueName: \"kubernetes.io/projected/78ba1908-cfe7-49cb-ae9f-befeb54c878c-kube-api-access-ss82q\") pod \"redhat-marketplace-7c7qn\" (UID: \"78ba1908-cfe7-49cb-ae9f-befeb54c878c\") " pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.507465 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78ba1908-cfe7-49cb-ae9f-befeb54c878c-utilities\") pod \"redhat-marketplace-7c7qn\" (UID: \"78ba1908-cfe7-49cb-ae9f-befeb54c878c\") " pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.507684 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/78ba1908-cfe7-49cb-ae9f-befeb54c878c-catalog-content\") pod \"redhat-marketplace-7c7qn\" (UID: \"78ba1908-cfe7-49cb-ae9f-befeb54c878c\") " pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.536897 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss82q\" (UniqueName: \"kubernetes.io/projected/78ba1908-cfe7-49cb-ae9f-befeb54c878c-kube-api-access-ss82q\") pod \"redhat-marketplace-7c7qn\" (UID: \"78ba1908-cfe7-49cb-ae9f-befeb54c878c\") " pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:16 crc kubenswrapper[4651]: I1126 15:24:16.562936 4651 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:17 crc kubenswrapper[4651]: I1126 15:24:17.069704 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7c7qn"] Nov 26 15:24:17 crc kubenswrapper[4651]: W1126 15:24:17.078007 4651 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78ba1908_cfe7_49cb_ae9f_befeb54c878c.slice/crio-f64ba617440c7b796ed501ccdb98dc25d19b941cabd6a164f171ffbea061e23f WatchSource:0}: Error finding container f64ba617440c7b796ed501ccdb98dc25d19b941cabd6a164f171ffbea061e23f: Status 404 returned error can't find the container with id f64ba617440c7b796ed501ccdb98dc25d19b941cabd6a164f171ffbea061e23f Nov 26 15:24:17 crc kubenswrapper[4651]: I1126 15:24:17.334756 4651 generic.go:334] "Generic (PLEG): container finished" podID="78ba1908-cfe7-49cb-ae9f-befeb54c878c" containerID="64c881d8f1d650822a5e6a75c5ea44828edd42addecf7e70f6892997ad99e115" exitCode=0 Nov 26 15:24:17 crc kubenswrapper[4651]: I1126 15:24:17.335026 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7c7qn" 
event={"ID":"78ba1908-cfe7-49cb-ae9f-befeb54c878c","Type":"ContainerDied","Data":"64c881d8f1d650822a5e6a75c5ea44828edd42addecf7e70f6892997ad99e115"} Nov 26 15:24:17 crc kubenswrapper[4651]: I1126 15:24:17.335093 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7c7qn" event={"ID":"78ba1908-cfe7-49cb-ae9f-befeb54c878c","Type":"ContainerStarted","Data":"f64ba617440c7b796ed501ccdb98dc25d19b941cabd6a164f171ffbea061e23f"} Nov 26 15:24:17 crc kubenswrapper[4651]: I1126 15:24:17.336572 4651 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:24:19 crc kubenswrapper[4651]: I1126 15:24:19.358886 4651 generic.go:334] "Generic (PLEG): container finished" podID="78ba1908-cfe7-49cb-ae9f-befeb54c878c" containerID="852694c068976b0e8e219adea4883f50ea212bf754aadab26f3a4f2ecd048cba" exitCode=0 Nov 26 15:24:19 crc kubenswrapper[4651]: I1126 15:24:19.358948 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7c7qn" event={"ID":"78ba1908-cfe7-49cb-ae9f-befeb54c878c","Type":"ContainerDied","Data":"852694c068976b0e8e219adea4883f50ea212bf754aadab26f3a4f2ecd048cba"} Nov 26 15:24:19 crc kubenswrapper[4651]: I1126 15:24:19.834254 4651 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8qtsl"] Nov 26 15:24:19 crc kubenswrapper[4651]: I1126 15:24:19.836789 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qtsl" Nov 26 15:24:19 crc kubenswrapper[4651]: I1126 15:24:19.852323 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qtsl"] Nov 26 15:24:19 crc kubenswrapper[4651]: I1126 15:24:19.977877 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhbl8\" (UniqueName: \"kubernetes.io/projected/dda78840-9aa5-4236-b49f-ddfdc1898cc4-kube-api-access-bhbl8\") pod \"certified-operators-8qtsl\" (UID: \"dda78840-9aa5-4236-b49f-ddfdc1898cc4\") " pod="openshift-marketplace/certified-operators-8qtsl" Nov 26 15:24:19 crc kubenswrapper[4651]: I1126 15:24:19.977970 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dda78840-9aa5-4236-b49f-ddfdc1898cc4-utilities\") pod \"certified-operators-8qtsl\" (UID: \"dda78840-9aa5-4236-b49f-ddfdc1898cc4\") " pod="openshift-marketplace/certified-operators-8qtsl" Nov 26 15:24:19 crc kubenswrapper[4651]: I1126 15:24:19.978083 4651 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dda78840-9aa5-4236-b49f-ddfdc1898cc4-catalog-content\") pod \"certified-operators-8qtsl\" (UID: \"dda78840-9aa5-4236-b49f-ddfdc1898cc4\") " pod="openshift-marketplace/certified-operators-8qtsl" Nov 26 15:24:20 crc kubenswrapper[4651]: I1126 15:24:20.079463 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dda78840-9aa5-4236-b49f-ddfdc1898cc4-catalog-content\") pod \"certified-operators-8qtsl\" (UID: \"dda78840-9aa5-4236-b49f-ddfdc1898cc4\") " pod="openshift-marketplace/certified-operators-8qtsl" Nov 26 15:24:20 crc kubenswrapper[4651]: I1126 15:24:20.079833 4651 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bhbl8\" (UniqueName: \"kubernetes.io/projected/dda78840-9aa5-4236-b49f-ddfdc1898cc4-kube-api-access-bhbl8\") pod \"certified-operators-8qtsl\" (UID: \"dda78840-9aa5-4236-b49f-ddfdc1898cc4\") " pod="openshift-marketplace/certified-operators-8qtsl" Nov 26 15:24:20 crc kubenswrapper[4651]: I1126 15:24:20.079879 4651 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dda78840-9aa5-4236-b49f-ddfdc1898cc4-utilities\") pod \"certified-operators-8qtsl\" (UID: \"dda78840-9aa5-4236-b49f-ddfdc1898cc4\") " pod="openshift-marketplace/certified-operators-8qtsl" Nov 26 15:24:20 crc kubenswrapper[4651]: I1126 15:24:20.080991 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dda78840-9aa5-4236-b49f-ddfdc1898cc4-utilities\") pod \"certified-operators-8qtsl\" (UID: \"dda78840-9aa5-4236-b49f-ddfdc1898cc4\") " pod="openshift-marketplace/certified-operators-8qtsl" Nov 26 15:24:20 crc kubenswrapper[4651]: I1126 15:24:20.081206 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dda78840-9aa5-4236-b49f-ddfdc1898cc4-catalog-content\") pod \"certified-operators-8qtsl\" (UID: \"dda78840-9aa5-4236-b49f-ddfdc1898cc4\") " pod="openshift-marketplace/certified-operators-8qtsl" Nov 26 15:24:20 crc kubenswrapper[4651]: I1126 15:24:20.099577 4651 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhbl8\" (UniqueName: \"kubernetes.io/projected/dda78840-9aa5-4236-b49f-ddfdc1898cc4-kube-api-access-bhbl8\") pod \"certified-operators-8qtsl\" (UID: \"dda78840-9aa5-4236-b49f-ddfdc1898cc4\") " pod="openshift-marketplace/certified-operators-8qtsl" Nov 26 15:24:20 crc kubenswrapper[4651]: I1126 15:24:20.226327 4651 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qtsl" Nov 26 15:24:20 crc kubenswrapper[4651]: I1126 15:24:20.404186 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7c7qn" event={"ID":"78ba1908-cfe7-49cb-ae9f-befeb54c878c","Type":"ContainerStarted","Data":"d292f64c4ce199d90dbc5b99037d9d9d2fe85efa0d8b1ea393ef322bcaa9aed3"} Nov 26 15:24:20 crc kubenswrapper[4651]: I1126 15:24:20.427595 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7c7qn" podStartSLOduration=1.959667724 podStartE2EDuration="4.427572625s" podCreationTimestamp="2025-11-26 15:24:16 +0000 UTC" firstStartedPulling="2025-11-26 15:24:17.336363694 +0000 UTC m=+2024.762111298" lastFinishedPulling="2025-11-26 15:24:19.804268585 +0000 UTC m=+2027.230016199" observedRunningTime="2025-11-26 15:24:20.426491176 +0000 UTC m=+2027.852238790" watchObservedRunningTime="2025-11-26 15:24:20.427572625 +0000 UTC m=+2027.853320229" Nov 26 15:24:20 crc kubenswrapper[4651]: I1126 15:24:20.876402 4651 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qtsl"] Nov 26 15:24:21 crc kubenswrapper[4651]: I1126 15:24:21.417453 4651 generic.go:334] "Generic (PLEG): container finished" podID="dda78840-9aa5-4236-b49f-ddfdc1898cc4" containerID="613a8559996281c7ba0ee5e7351895ebf8e3ed5289f0bfb54c717029db3e5c94" exitCode=0 Nov 26 15:24:21 crc kubenswrapper[4651]: I1126 15:24:21.417716 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qtsl" event={"ID":"dda78840-9aa5-4236-b49f-ddfdc1898cc4","Type":"ContainerDied","Data":"613a8559996281c7ba0ee5e7351895ebf8e3ed5289f0bfb54c717029db3e5c94"} Nov 26 15:24:21 crc kubenswrapper[4651]: I1126 15:24:21.417774 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qtsl" 
event={"ID":"dda78840-9aa5-4236-b49f-ddfdc1898cc4","Type":"ContainerStarted","Data":"a0ec2fe51062f7782728ba6cc7a54b9787c630757ca73b4542971d43a8906a2d"} Nov 26 15:24:22 crc kubenswrapper[4651]: I1126 15:24:22.428915 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qtsl" event={"ID":"dda78840-9aa5-4236-b49f-ddfdc1898cc4","Type":"ContainerStarted","Data":"6971b858864aa2ab6f558283c62d9c0de78bcd118413b6b9eda4d9405ad2e8d0"} Nov 26 15:24:24 crc kubenswrapper[4651]: I1126 15:24:24.456503 4651 generic.go:334] "Generic (PLEG): container finished" podID="dda78840-9aa5-4236-b49f-ddfdc1898cc4" containerID="6971b858864aa2ab6f558283c62d9c0de78bcd118413b6b9eda4d9405ad2e8d0" exitCode=0 Nov 26 15:24:24 crc kubenswrapper[4651]: I1126 15:24:24.456619 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qtsl" event={"ID":"dda78840-9aa5-4236-b49f-ddfdc1898cc4","Type":"ContainerDied","Data":"6971b858864aa2ab6f558283c62d9c0de78bcd118413b6b9eda4d9405ad2e8d0"} Nov 26 15:24:26 crc kubenswrapper[4651]: I1126 15:24:26.489192 4651 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qtsl" event={"ID":"dda78840-9aa5-4236-b49f-ddfdc1898cc4","Type":"ContainerStarted","Data":"2fd82548a328ab11e43246c8a2ebc953edbb049e5c48dd43377ad482191e9da3"} Nov 26 15:24:26 crc kubenswrapper[4651]: I1126 15:24:26.513204 4651 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8qtsl" podStartSLOduration=3.4815478 podStartE2EDuration="7.513183606s" podCreationTimestamp="2025-11-26 15:24:19 +0000 UTC" firstStartedPulling="2025-11-26 15:24:21.423731931 +0000 UTC m=+2028.849479535" lastFinishedPulling="2025-11-26 15:24:25.455367727 +0000 UTC m=+2032.881115341" observedRunningTime="2025-11-26 15:24:26.511639793 +0000 UTC m=+2033.937387417" watchObservedRunningTime="2025-11-26 15:24:26.513183606 +0000 UTC 
m=+2033.938931220" Nov 26 15:24:26 crc kubenswrapper[4651]: I1126 15:24:26.563829 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:26 crc kubenswrapper[4651]: I1126 15:24:26.564221 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:26 crc kubenswrapper[4651]: I1126 15:24:26.609111 4651 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:27 crc kubenswrapper[4651]: I1126 15:24:27.553690 4651 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7c7qn" Nov 26 15:24:27 crc kubenswrapper[4651]: I1126 15:24:27.814090 4651 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7c7qn"] Nov 26 15:24:29 crc kubenswrapper[4651]: I1126 15:24:29.512805 4651 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7c7qn" podUID="78ba1908-cfe7-49cb-ae9f-befeb54c878c" containerName="registry-server" containerID="cri-o://d292f64c4ce199d90dbc5b99037d9d9d2fe85efa0d8b1ea393ef322bcaa9aed3" gracePeriod=2